Concept of APD & API?

Can you please tell me the concept of APD & API?

Hi Yaram,
The Analysis Process Designer (APD) is the application environment for the SAP data mining solution.
The APD workbench provides a graphical interface that enables you to visualize, transform, and deploy data from your business warehouse. It combines all these different steps into a single data process with which you can easily interact.
We can visualize the scenarios graphically. Think of it along the lines of a Process Chain: it simply groups the logical steps together in a graphical view. We can also look at the intermediate results after the transformations are applied.
Check this link on APD:
http://help.sap.com/saphelp_nw04/helpdata/en/49/7e960481916448b20134d471d36a6b/frameset.htm
For API check this link:
http://help.sap.com/saphelp_nw04/helpdata/en/2e/260a8563b111d4b2ea0050dadfb23f/frameset.htm
regards,
raghu.

Similar Messages

  • Can we use a variant in APD

    Hi Friends,
    I am creating 3 or 4 queries and I want to load the results of these queries into a transactional ODS. I want to create a variant for each query and use it in this process. I am not sure whether we can use variants in an APD process or not; if anyone has done such a scenario, please guide me.
    Thanks in advance

    Hi,
    unfortunately not. You either need to change the query variant within the analysis process, or you need to create copies of your analysis process where you can put in the different query variants.
    You are right, a variant concept for APD would help in quite a few scenarios. But it is not there.
    Cheers,
    Thomas

  • Linking 2 WebI reports and passing values

    Hi All,
    I have an issue here. I have 2 reports created in WebI. Report1 is a summary report, and it has an object that acts like a hyperlink, e.g. 2001, 2002, 2003; when I click on 2001 it should open report2 and show the quarter values Q2, Q3, Q4.
    Regards
    Prashant

    Hi Prashant,
    You will have to use the OpenDocument API.
    Here are the steps:
    1. Open a report in WebI that has a Year object.
    2. Open another report that has the Year and Quarter objects.
    In the first report, type the link below into the formula toolbar for the Year cell. The syntax is as follows (use the name of the second report, saved without spaces, and the prompt defined in it, here "Enter Year:"):
    ="<a href=http://<ServerName>/businessobjects/enterprise115/desktoplaunch/opendoc/openDocument.jsp?sType=wid&sDocName=<SecondReportName>&lsSEnterYear:="+[Year]+">"+[Year]+"</a>"
    Then, in the properties window, set Read Cell Content As to Hyperlink.
    You will get exactly what you're looking for.
    Hopefully this solves your problem.
    Regards,
    Bernard.
    Edited by: Florencio Sequeira on Jun 20, 2008 2:24 PM
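    For reference, the same openDocument link can also be assembled outside of the WebI formula language. The sketch below is a minimal Java illustration of the URL pattern above; the server name, document name ("QuarterDetailReport") and prompt name ("Enter Year:") are placeholders, not values from this thread.
    import java.net.URLEncoder;

    public class OpenDocLinkBuilder {

        // Builds the <a href=...> fragment for one Year value.
        static String buildLink(String year) throws Exception {
            // Hypothetical server and report names -- replace with your own.
            String base = "http://boserver/businessobjects/enterprise115/desktoplaunch/opendoc/openDocument.jsp";
            String query = "sType=wid"
                    + "&sDocName=" + URLEncoder.encode("QuarterDetailReport", "UTF-8")
                    + "&lsSEnterYear:=" + URLEncoder.encode(year, "UTF-8");
            return "<a href=\"" + base + "?" + query + "\">" + year + "</a>";
        }

        public static void main(String[] args) throws Exception {
            System.out.println(buildLink("2001"));
        }
    }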

  • Hi, how to use the TreeMap class?

    Hi, I'm just trying to use the TreeMap class, but I'm a newbie and don't know some of
    the concepts. I opened the API docs and read them thoroughly, but when I use the put() method,
    put(Object key, Object value) // don't know...
    I don't know what 'key' means, what I should put there, or what it is used for.
    I also don't know what the value is used for, and I just don't know when we use the
    TreeMap class. Could anyone tell me about these two parameters? I'm just a beginner.
    Thanks, guys.

    Maps are used for mapping a key to a value.
    For instance, say you have a Student class that has properties like studentID, lastName, firstName, schedule, etc. If you want to be able to access students by their ID, you might do something like this (mix of pseudocode and Java):
    for each student {
        map.put(student.getId(), student);
    }
    Later, say your app asks the user which student he wishes to look at, and the user enters "123abc":
    String id = readStudentIdFromUser(); // user enters "123abc"
    Student student = (Student) map.get(id);
    For more info: http://java.sun.com/docs/books/tutorial/collections/
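    A small, self-contained Java version of the same idea (the Student class and the IDs below are made up for illustration; a TreeMap additionally keeps its keys sorted):
    import java.util.Map;
    import java.util.TreeMap;

    public class StudentLookup {

        static class Student {
            final String id;
            final String lastName;
            Student(String id, String lastName) {
                this.id = id;
                this.lastName = lastName;
            }
        }

        public static void main(String[] args) {
            // key = student ID, value = the Student object itself
            Map<String, Student> byId = new TreeMap<String, Student>();
            byId.put("456def", new Student("456def", "Smith"));
            byId.put("123abc", new Student("123abc", "Jones"));

            // get() looks a value up by its key
            Student s = byId.get("123abc");
            System.out.println(s.lastName);              // prints: Jones

            // A TreeMap keeps its keys sorted, so iteration is in key order
            for (Map.Entry<String, Student> e : byId.entrySet()) {
                System.out.println(e.getKey() + " -> " + e.getValue().lastName);
            }
        }
    }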

  • [svn:bz-trunk] 19459: Security API change for auth sync sample/ concept to work in WebLogic, WebSphere.

    Revision: 19459
    Author:   [email protected]
    Date:     2010-12-17 10:15:23 -0800 (Fri, 17 Dec 2010)
    Log Message:
    Security API change for auth sync sample/concept to work in WebLogic, WebSphere.
    Adding the PrincipalConverter interface
    Implement the converting principal in WebLogic and WebSphere login command
    Modified Paths:
        blazeds/trunk/modules/opt/src/weblogic/flex/messaging/security/WeblogicLoginCommand.java
        blazeds/trunk/modules/opt/src/websphere/flex/messaging/security/WebSphereLoginCommand.java
    Added Paths:
        blazeds/trunk/modules/core/src/flex/messaging/security/PrincipalConverter.java
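    As a rough idea of what such a converter is for - turning an already-authenticated user name into the container-specific Principal that the WebLogic/WebSphere security APIs expect - here is a purely hypothetical sketch; the actual interface added in PrincipalConverter.java may declare a different method, so check the source path above.
    import java.security.Principal;

    // Hypothetical illustration only - not the actual BlazeDS interface.
    public interface PrincipalConverter {

        /**
         * Convert a user name that was authenticated elsewhere into the
         * container-specific Principal required by the app server's security APIs.
         */
        Principal convertPrincipal(String userName);
    }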

    Thanks for the reply... I found the solution after several tries. I had to set the channel from ActionScript instead of depending on the services-config.xml file, like the following, and then it worked:
    var cs:ChannelSet = new ChannelSet();
    var customChannel:Channel = new AMFChannel("my-amf", "http://localhost:8080/somehting/messagebroker/amf");
    cs.addChannel(customChannel);
    consumer = new Consumer();
    consumer.channelSet = cs;

  • I am in the process of doing a Proof Of Concept / Evaluating products that can help us build a Java Application to Convert a PDF document to a Searchable PDF.   I wanted to check is there any simple JAVA API from Adobe to achive this ? Any direction in th

    I am in the process of doing a Proof of Concept / evaluating products that can help us build a Java application to convert a PDF document to a searchable PDF.
    I wanted to check whether there is any simple Java API from Adobe to achieve this. Any direction in this regard is greatly appreciated.

    You can achieve this using the LiveCycle PDF Generator Java API. You can find the required code here:
    Adobe LiveCycle * Quick Start (SOAP mode): Converting a Microsoft Word document to a PDF document using the Java API
    In parameters:
    //Set createPDF2 parameter values
    String adobePDFSettings = "Standard";
    String securitySettings = "No Security";
    String fileTypeSettings = "Standard OCR";
    "Standard OCR" file type setting will run OCR on input pdf. In the code, instead of doc file provide a pdf file. Resultant pdf will be searchable PDF i.e OCRed PDF.
    Feel feel to ask any further questions.

  • Using Brio with VB but without the Brio VB API

    Hi there. We're currently attempting to integrate our product (a website) with Brio. For the time being it's only being done for a demo to gauge whether our customers are interested in the functionality or not, so it may or may not actually happen for the version proper. Because of this our company is unwilling to invest in the API just yet, but still wants the work done (the usual developer's conundrum). I'd like to know whether it is possible to call into Brio, generate a report and embed the results on a webpage without using the VB API supplied by Hyperion. Does anyone know how to do this? It's being done for a demo, so efficiency and performance aren't yet relevant - just more or less a hacked version that can act as a proof of concept. I'm personally new to Brio, so I don't really know how to go about it. Do we need to get the VB API immediately, or can we wait until we know we need to write a version that we can sell? Basically, the task is to pass a customer ID into a .bqy file when the user selects an option from a dropdown, and display the results in a frame on the webpage. Any thoughts?

    This kind of question is sometimes posted. To use GPIB from a .NET managed environment, the easiest way is to use the VISA COM software. The VISA COM software is available on every PC on which NI-VISA 3.0 is installed. I posted a C# example using VISA COM at the following post. Basically the approach is the same for VB.NET and VC++.NET.
    http://exchange.ni.com/servlet/ProcessRequest?RHIVEID=101&RPAGEID=135&HOID=50650000000800000047A30000&USEARCHCONTEXT_CATEGORY_0=_26_%24_13_&USEARCHCONTEXT_CATEGORY_S=0&UCATEGORY_0=_26_%24_13_&UCATEGORY_S=0
    Makoto

  • JAVA API to add header and footer in MS Word

    Hi,
    please help me out, I am stuck on this.
    The problem is that I need to add headers to an MS Word file using Java. I have come to know that the APIs that support this task are 1. Apache POI 2. Jacob.
    I don't know which class I need to use to do this. If anyone knows, please let me know or send the URLs.
    Thankyou,
    Baskaran.k

    Yes, I have used POI, but I don't know which class in POI to use. I am just now trying Jacob. Can you send the API and class to add headers and footers in MS Word?
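    For what it's worth, here is a minimal sketch using Apache POI's XWPF classes, which work on the .docx format (POI's support for the old binary .doc format is much more limited). It assumes POI 3.16 or newer, where XWPFDocument has createHeader/createFooter; the file name and texts are placeholders.
    import java.io.FileOutputStream;

    import org.apache.poi.wp.usermodel.HeaderFooterType;
    import org.apache.poi.xwpf.usermodel.XWPFDocument;
    import org.apache.poi.xwpf.usermodel.XWPFFooter;
    import org.apache.poi.xwpf.usermodel.XWPFHeader;

    public class HeaderFooterDemo {
        public static void main(String[] args) throws Exception {
            XWPFDocument doc = new XWPFDocument();
            doc.createParagraph().createRun().setText("Body text goes here.");

            // Default header shown on every page
            XWPFHeader header = doc.createHeader(HeaderFooterType.DEFAULT);
            header.createParagraph().createRun().setText("My header text");

            // Default footer shown on every page
            XWPFFooter footer = doc.createFooter(HeaderFooterType.DEFAULT);
            footer.createParagraph().createRun().setText("My footer text");

            FileOutputStream out = new FileOutputStream("with-header-footer.docx");
            doc.write(out);
            out.close();
            doc.close();
        }
    }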

  • Excel Function to query BW using Web Services API

    Hello,
    I need to write an Excel function as an .XLA that will do the following:
    It will be invoked like any other Excel function. E.g. =GetCost(param p1, param p2)
    This function queries an InfoCube and must return a single numerical value for the Cell in Excel that the function is used in.
    I have never done anything like this before. I have done a decent amount of reading to discover the best way to attack this. It seems like the best way to create this functionality is to:
    1. create a web service in a BW function group with a function module that is RFC enabled.
    2. I'm thinking the function module will use native SQL to query the InfoCube and pull the value and then somehow pass this value to a web service.
    3. The value in the web service will be accessed through its API in Excel/VBA.
    I am not sure if this is the best way to do this, or if it is even possible to do it this way. I was wondering if anyone could tell me if I am heading down the right path and possibly direct me to a tutorial or other information that would aid in accomplishing this. I have yet to find some type of proof of concept from beginning to end on how to do something like this. Any help would be greatly appreciated.
    These are the documents I have found insightful so far:
    How to build an XLA: http://www.fontstuff.com/vba/vbatut03.htm
    If you can read a table and dump it into Excel then you can query an InfoCube? https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/c043d836-166c-2910-b99e-ae3633dec547
    How to view a webservice homepage: http://help.sap.com/saphelp_nw04/helpdata/en/1c/472e22c45cc94599ab3725bc9558d2/content.htm
    How to create a webservice:
    http://help.sap.com/saphelp_nw04/helpdata/en/9b/dad1ae3908ee44a5caf57e10918be9/frameset.htm
    What I am missing is how to pass data from the function module to the webservice, and from the webservice to the Excel/VBA code.
    Thanks!
    -Gary
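    As a rough illustration of step 3 - calling an RFC-enabled function module exposed as a web service - here is a minimal sketch in Java (rather than VBA) using the standard SAAJ API. The endpoint URL, namespace, operation name and parameter names are hypothetical placeholders; the real ones come from the generated service's WSDL, and the equivalent call would then be wired into the XLA function.
    import javax.xml.namespace.QName;
    import javax.xml.soap.MessageFactory;
    import javax.xml.soap.SOAPConnection;
    import javax.xml.soap.SOAPConnectionFactory;
    import javax.xml.soap.SOAPElement;
    import javax.xml.soap.SOAPMessage;

    public class GetCostClient {
        public static void main(String[] args) throws Exception {
            SOAPConnection con = SOAPConnectionFactory.newInstance().createConnection();

            // Build the request for a hypothetical Z_GET_COST function module
            SOAPMessage request = MessageFactory.newInstance().createMessage();
            SOAPElement op = request.getSOAPBody().addChildElement(
                    new QName("urn:sap-com:document:sap:rfc:functions", "Z_GET_COST"));
            op.addChildElement("IV_PARAM1").addTextNode("value1");
            op.addChildElement("IV_PARAM2").addTextNode("value2");
            request.saveChanges();

            // Hypothetical endpoint -- the real URL comes from the web service homepage
            String endpoint = "http://bwhost:8000/sap/bc/srt/rfc/sap/Z_GET_COST?sap-client=100";
            SOAPMessage response = con.call(request, endpoint);

            // Read the single numeric result out of the response body
            String cost = response.getSOAPBody().getTextContent().trim();
            System.out.println("Cost = " + cost);
            con.close();
        }
    }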

    /bump

  • Concept of groups vs concept of roles

    Hi!
    I'm designing an LDAP structure mainly for authentication and authorization of users. I want to use the LDAP server for applications, intranet (different platforms like linux, NT, ...) and portals.
    I read the Admin guide about groups and roles and found that there aren't that many reasons for using roles instead of groups. The only real difference is (as I understood it) that when using roles, I don't have to search for the groups a user is a member of, because every user contains the nsrole attribute with all the roles he is a member of.
    One big reason for not using roles is that they are quite specific to iPlanet Directory Server. If one ever changes to another product (for example OpenLDAP) the roles concept may or may not be the same. When using groups I don't have that problem.
    (If my information about that is incorrect please contradict!)
    A mixture of groups and roles is a quite bad idea because if I put a group in a role, the "nsrole" attribute is added only to the group but not to the members of the group, so if I use roles, I should stick to them and should not use any groups.
    As I told at the beginning, I am planning an LDAP structure. I don't have any "real life LDAP-experience" so if your experience is different, please tell me.
    Thanks in advance for your opinion!
    Florian

    > 1. Why there could be a problem without scopes in groups. If I have two companies and each of them
    > has a group "employees". Two companies would probably be separated in two different subtrees, so I
    > just use a dynamic group, where I can specify a subtree where group members can be located, or I use
    > static groups, where I define each entry.
    You see, you had to make a choice on which group type you could use - not because one was more convenient for defining members for the problem at hand, but because only one would work at all.
    One thing I did not mention about the advantages of roles: they all work the same way - if a new role type were invented, applications written to work with roles prior to the new role would still work with that role type. Group types are so different that forward compatibility is not possible - mostly because, to even use groups, applications have to do all the work for common things like enumerating the group, enumerating the groups an entry belongs to, testing for group membership, etc.
    > 2. The coding logic for group evaluation with dynamic and static groups and even mixtures of it is
    > quite complicated; it is much easier to ask an entry for a roledn and that's it, but do most clients
    > support roles?
    Probably not. But then roles have not been around as long. I don't have any hard data on how many apps use roles - you would be surprised how hard it is for a developer to get that data.
    > As far as I know roles are not used in any other LDAP server.
    Well, the Sun DS and the Netscape DS (which admittedly were once the same thing) both support the same roles.
    > So you can optimize an application by implementing role-based queries, but if you have an OpenLDAP
    > environment you also need a possibility to use groups.
    Talk to the OpenLDAP people about that. I believe they (at one time at least) decided to support the Netscape slapi interface - roles have interface components in that API.
    I do understand what you are saying - there isn't an RFC, so other servers don't support roles. Well, I'm sorry, I never got around to it. To be perfectly frank, a lot of LDAP RFCs/drafts merely describe some proprietary mechanism which other servers never adopt. Some even describe mechanisms that nobody has ever implemented.
    When it comes down to it, it is only you who can decide whether being able to move to OpenLDAP or some other server without any reimplementation is an important consideration. Every server will have features not supported by others, and if your choice is to use only those that are commonly supported, then that is your choice.
    Roles allow much less complex coding in order to use them and they are much faster than equivalent client-side operations, but the price is non-conformance with other servers. But when that non-conformance simply boils down to entries which merely "describe" the groups without adding application-level functionality - how much have you really lost? Well, until you need to change server vendor you have only gained, and then you'll need to put in the effort you saved earlier.
    > On the other side, what applications do support roles right now? (I really don't know)
    Apart from applications by vendors that also supply a DS I don't know either - but support for features such as this needs to come from customers of those products. It is surprisingly simple to add support for roles in a product (for most it will almost be free) - much simpler than for groups.
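    To make the point about coding complexity concrete, here is a small JNDI sketch contrasting the two lookups. The host, DNs and group attributes are hypothetical placeholders; the nsrole read is the Sun/Netscape DS mechanism discussed in this thread.
    import java.util.Hashtable;

    import javax.naming.Context;
    import javax.naming.NamingEnumeration;
    import javax.naming.directory.Attributes;
    import javax.naming.directory.DirContext;
    import javax.naming.directory.InitialDirContext;
    import javax.naming.directory.SearchControls;
    import javax.naming.directory.SearchResult;

    public class RoleVsGroupLookup {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<String, String>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://ldaphost:389");   // hypothetical server
            DirContext ctx = new InitialDirContext(env);

            String userDn = "uid=jdoe,ou=People,dc=example,dc=com"; // hypothetical entry

            // Roles: one read of the user's entry returns the computed nsrole values.
            Attributes attrs = ctx.getAttributes(userDn, new String[] { "nsrole" });
            System.out.println("roles: " + attrs.get("nsrole"));

            // Groups: the client must search for every group that lists the user,
            // and cover each group type it might meet (static members shown here).
            SearchControls sc = new SearchControls();
            sc.setSearchScope(SearchControls.SUBTREE_SCOPE);
            NamingEnumeration<SearchResult> groups = ctx.search(
                    "ou=Groups,dc=example,dc=com",
                    "(|(uniqueMember={0})(member={0}))",
                    new Object[] { userDn }, sc);
            while (groups.hasMore()) {
                System.out.println("group: " + groups.next().getNameInNamespace());
            }
            ctx.close();
        }
    }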

  • How to determine host status in a task via python api

    Hello all,
    using the Python API I am having some issues trying to determine the status of a host within a task. For example, I have a job with one task and 10 hosts associated with that task. Eight hosts finish the task, 2 fail. However, via the API I can only seem to determine the status of the job and the task. What I want is to be able to generate a report that basically matches what you see via the GUI (i.e. 8 hosts OK, 2 hosts failed). The data structure OnStageTaskData gives me a status and a list of hosts, but not a status for each host. (Note: I am not using any of the deprecated functions/data structures.) Currently my code looks like this (minus the API init, etc.)...
    # list all the jobs for last week
    joblist = api.GetJobs()
    for job in joblist:
        # if the job occurred in the last week
        if job.c_time > reportstarttime:
            print "\nJob name %s Time %s " % (job.name, time.ctime(job.c_time))
            try:
                jobdetails = api.GetOnStageJob(job.id)
            except COsApiJobNotFoundException:
                print "Can't find job %s details" % job.name
                continue
            for tasks in jobdetails.job_data.tasks:
                for hostid in tasks.task_data.target:
                    try:
                        hostinfo = api.GetHostGroup(hostid)
                    except OsApiHostNotFoundException:
                        print "Can't find host details %s" % tasks.task_data.name
                        continue
                    print "Hostname %s Status %s" % (hostinfo.name, tasks.status)
    which generates output like...
    Job name chg233146-sol9 Time Sun Jan 13 10:31:42 2008
    Hostname tacnomsrv02 Status Failed
    Hostname tacpthsrv01 Status Failed
    Does anyone have another way of doing this, or suggestions? Or is this kind of info not available via the API? Thank you.

    Hi ConnectSolutions,
    As you correctly pointed out, there is a bug in our code that keeps the room active, and you don't receive any events or notifications if you are entering as a guest and waiting and the host arrives. Also, since you haven't entered yet, i.e. your role is still 5, i.e. UserRoles.LOBBY, you can't access any of the UserManager's collections and will always get a length of 0. We will be fixing this with priority.
    But you can get around the problem for now by having a small shared model of yours (any collectionNode or sharedModel will do) where you create a node and publish a message on it from the owner's side whenever the owner/host enters, to notify everyone that he has entered. Just remember to set the access model of the NodeConfiguration of the node on which you are publishing to LOBBY, i.e. role = 5, and make the publishModel = 100 so that only owners can publish on this node. In this way, any waiting users will receive this message and will know the host has arrived. If you can't get this sharedModel concept to work, let me know. I will try to run it on my side.
    As for the actual fix on our side, we will fix this use case, put it in the next drop of the SDK, and also update the forum about it.
    On the host side though, he is always notified when he enters if there are pending users knocking to enter. See the KnockingQueue example in case you want to explore that.
    Thanks
    Hironmay Basu

  • Best practice to monitor 10gR3 OSB performance using JMX API?

    Hi guys,
    I need some advice on the best practice to monitor 10gR3 OSB performance using JMX API.
    Jus to show I have done my home work, I managed to get the JMX sample code from
    http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/jmx_monitoring/example.html#wp1109828
    working.
    The following is the list of options I am thinking about:
    * Set up: I have a cluster with 1 admin server and 2 managed servers; each managed server runs an instance of OSB
    * What I am trying to achieve:
    - use the JMX API to collect OSB stats data periodically, as in the sample code above, then save the data as a record to a database table
    Options/ideas:
    1. Simplest approach: run the modified version of the JMX sample on the Admin Server to save stats data to the database regularly. I can't see problems with this one ...
    2. Use WLI to schedule the task of collecting stats data regularly. This may be overkill if option 1 above is good for production.
    3. Deploy a simple web app on the Admin Server, say a simple servlet that displays a simple page to start/stop and configure the data collection interval for the timer.
    What approach would you experts recommend?
    BTW, the caveats of using JMX in http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/jmx_monitoring/concepts.html#wp1095673
    says
         Oracle strongly discourages using this API in a concurrent manner with more than one thread or process. This is because a reset performed in
         one thread or process is not visible to another threads or processes. This caveat also applies to resets performed from the Monitoring Dashboard of
         the Oracle Service Bus Console, as such resets are not visible to this API.
    Under what scenario would I be breaking this rule? I am a little worried about its statement
         discourages using this API in a concurrent manner with more than one thread or process
    Thanks in advance,
    Sam

    Hi Manoj,
    Thanks for getting back. I am afraid configuring the aggregation interval from the Dashboard doesn't solve the problem, as I need to collect stats data per endpoint URI on an hourly or daily basis, then output it to CSV files so line graphs can be drawn for chosen applications.
    Just for those who may be interested: it's not possible to use SQL to query the database tables to extract OSB stats for a specified time period, say 9am - 5pm. I raised a support case already and the response I got back is 'No'.
    That means using the JMX API will be the way to go :)
    Has anyone actually done this kind of OSB stats report and would care to give some pointers?
    I am thinking of using 7 days or 1 day as the aggregation interval set in the Dashboard of the OSB admin console, then collecting stats data via JMX (as described in the previous link) hourly using the WebLogic Server JMX Timer Service as described in
    http://download.oracle.com/docs/cd/E12840_01/wls/docs103/jmxinst/timer.html instead of Java's Timer class.
    Not sure if this is the best practice.
    Thanks,
    Regards,
    Sam
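    For what it's worth, a minimal sketch of the plumbing for option 1 combined with the hourly collection idea: it only shows connecting to the admin server's domain runtime MBean server over t3 (wljmxclient.jar or wlfullclient.jar on the classpath) and scheduling a periodic task with a plain ScheduledExecutorService. Host, port and credentials are placeholders, and the actual statistics retrieval and database insert would follow the ServiceDomainMBean sample linked above.
    import java.util.Hashtable;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    import javax.management.MBeanServerConnection;
    import javax.management.remote.JMXConnector;
    import javax.management.remote.JMXConnectorFactory;
    import javax.management.remote.JMXServiceURL;
    import javax.naming.Context;

    public class OsbStatsCollector {
        public static void main(String[] args) throws Exception {
            // Connect to the domain runtime MBean server on the admin server (placeholders).
            JMXServiceURL url = new JMXServiceURL("t3", "adminhost", 7001,
                    "/jndi/weblogic.management.mbeanservers.domainruntime");
            Hashtable<String, Object> env = new Hashtable<String, Object>();
            env.put(Context.SECURITY_PRINCIPAL, "weblogic");
            env.put(Context.SECURITY_CREDENTIALS, "welcome1");
            env.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES, "weblogic.management.remote");

            JMXConnector connector = JMXConnectorFactory.connect(url, env);
            final MBeanServerConnection mbs = connector.getMBeanServerConnection();
            System.out.println("Connected to " + mbs.getDefaultDomain());

            // Collect once an hour from a single thread (the docs discourage concurrent use of the API).
            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            scheduler.scheduleAtFixedRate(new Runnable() {
                public void run() {
                    // Look up ServiceDomainMBean via 'mbs' as in the linked JMX sample,
                    // read the statistics snapshot and insert a row into the database here.
                }
            }, 0, 1, TimeUnit.HOURS);
        }
    }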

  • What is the serialization concept in ALE/IDOC?

    What is the serialization concept in ALE/IDOC?

    Hi Srinu ,
    IDoc serialization means sending/posting the IDocs in sequence.
    We serialize IDocs in the following cases:
    · If you want the Integration Server to process the corresponding IDoc XML messages in the same sequence that it receives them from the IDoc adapter at the inbound channel.
    · If you want the receiver to receive the IDocs in the same sequence that the IDoc adapter sends them at the Integration Server outbound channel.
    The sequence at the Integration Server inbound or outbound channel can only be guaranteed if only IDocs are processed, and not if different protocols (for example, IDocs and proxies) are processed together.
    Do not confuse IDoc serialization using the IDoc adapter with the ALE serialization of IDocs.
    Prerequisites
    · The quality of service EOIO (Exactly Once In Order) must be specified in the message header.
    · The receiver system or the sender system must be based on SAP Web Application Server 6.40 or higher. If this is not the case, the quality of service is automatically changed to EO for compatibility reasons and the message is processed accordingly.
    Procedure
    If you want the Integration Server to process the IDoc XML messages created by the IDoc adapter in the same sequence that the IDocs are sent by your application, proceed as follows:
    · Enter a queue name in your application. You can use 16 alphanumeric characters. The prefix SAP_ALE_ is then added.
    The IDoc adapter checks the prefix and replaces it with the prefix of the corresponding Integration Server inbound queue (for example, XBQI0000).
    If you want the receiver to receive the IDocs in the same sequence that they are sent by the Integration Server using the IDoc adapter, proceed as follows:
    · In the communication channel, select the check box Queue processing for the receiver.
    The IDoc adapter replaces the prefix of the outbound queue (XBQO) with the prefix SAP_ALE_.
    You can display the individual messages in the qRFC monitor of the outbound queue. To do this, do one of the following:
    - Use the queue ID in the list of displayed messages in the monitor for processed XML messages.
    - Use the transaction ID in the list of displayed XML messages in the IDoc adapter.
    - Call the transaction qRFC Monitor (Outbound Queue) (SMQ1).
    To navigate directly to the display of messages in the IDoc adapter, double click the transaction ID of a message in the outbound queue.
    To do this, you must have registered the display program IDX_SHOW_MESSAGE for the outbound queue in the qRFC administration (transaction SMQE) beforehand.
    In both cases, the function module IDOC_INBOUND_IN_QUEUE is called, which enables EOIO processing of the messages. The processing sequence is determined by the sequence of the function module calls.
    Unlike the other function modules (interface versions from the communication channel), with this function module you have to transfer segment types rather than segment names in the data records.
    Serialization of Messages
    Use
    Serialization plays an important role in distributing interdependent objects, especially when master data is being distributed.
    IDocs can be created, sent and posted in a specified order by distributing message types serially.
    Errors can then be avoided when processing inbound IDocs.
    Interdependent messages can be serially distributed in the following ways:
    Serialization by Object Type
    Serialization by Message Type
    Serialization at IDoc Level
    (not for IDocs from generated BAPI-ALE interfaces)
    Serialization at IDoc Level
    Use
    Delays in transferring IDocs may result in an IDoc containing data belonging to a specific object arriving at its destination before an "older" IDoc that contains different data belonging to the same object. Applications can use the ALE Serialization API to specify the order IDocs of the same message type are processed in and to prevent old IDocs from being posted if processing is repeated.
    SAP recommends that you regularly schedule program RBDSRCLR to clean up table BDSER (old time stamp).
    Prerequisites
    IDocs generated by BAPI interfaces cannot be serialized at IDoc level because the function module for inbound processing does not use the ALE Serialization API.
    Features
    ALE provides two function modules to serialize IDocs which the posting function module has to invoke:
    · IDOC_SERIALIZATION_CHECK
    checks the time stamps in the serialization field of the IDoc header.
    · IDOC_SERIAL_POST
    updates the serialization table.
    Check the following link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/0b/2a66d6507d11d18ee90000e8366fc2/frameset.htm
    http://help.sap.com/saphelp_nw04s/helpdata/en/78/2175a751ce11d189570000e829fbbd/frameset.htm
    Example: ADRMAS, DEBMAS (customer)
    ADRMAS, CREMAS (vendor)
    In this case, before posting customer data or vendor data, the address data is required first.
    Rgds
    Sree m

  • Doubts in GeoRaster Concept.

    Hi everybody,
    I have a few doubts about GeoRaster concepts.
    I did mosaicking of multiple GeoRaster objects using "sdo_geor.getRasterSubset()" and was able to display the image properly. But while doing this I came across a few people's suggestions. They said that mosaicking multiple rows together in a GeoRaster table is not going to produce meaningful results, because the interpolation methods won't have access to the data in the adjacent cells at the seams, since the cells needed exist in a different row (i.e. where two rows of GeoRaster either abut or overlap).
    I assume Oracle takes care of all this. Please suggest whether my assumption is true or the statement given is true.
    Regards,
    Baskar
    Edited by: user_baski on May 16, 2010 10:49 PM

    Hi Jeffrey,
    Requirements:-
    I have to do mosaicking of 'n' number of GeoRaster objects. For example, if the table has 4 rows of GeoRaster objects, then I have to create a single image by mosaicking all the GeoRaster objects based on the envelope provided. (Note: I have to do this with queries, without using the GeoRaster API.)
    Workflow:-
    1. Get the connection and table details.
    2. Retrieve necessary information from the DB like SRID, MAXPYRAMID, SPATIALRESOLUTION, EXTENT, etc. For getting the extent, I used the SDO_AGGR_MBR function.
    3. With the help of "MDSYS.SDO_FILTER" and the bounding box values, I create an ArrayList that contains raster IDs retrieved from the raster data table covering the bounding box value provided in the filter command.
    4. Then I passed the bounding box value into the "sdo_geor.getCellCoordinate" function, retrieved the row and column numbers of the GeoRaster image, and created a number array that contains the starting and ending row/column numbers.
    5. Then I wrote a PL/SQL block with the "sdo_geor.getRasterSubset" function, which takes the number array and raster ID as input parameters and in turn returns a BLOB object.
    6. I am executing step 5 in a loop with all the raster IDs that I got at step 3. For example, if the ArrayList size is 4, then I will have four BLOB objects.
    7. Finally, I create a new image from the BLOB objects after some scaling and cropping based on the individual GeneralEnvelope of each raster ID object.
    I followed all the above steps and successfully created the mosaic image. However, a few people suggested that mosaicking in this way does not produce meaningful results, because the interpolation methods won't have access to the data in the adjacent cells at the seams, since the cells needed exist in a different row. I assume Oracle will take care of these things. Moreover, they suggested keeping a single row in the GeoRaster table instead of multiple rows of GeoRaster objects, and suggested using the "SDO_GEOR.updateRaster" function to update a part of the raster object, so that the pyramids are rebuilt automatically.
    So please suggest which is the better way to do mosaicking, and whether my assumption is correct or not.

  • How to add a polygon to mapviewer using Javascript API

    Hi,
    I am using JavaScript API for the MapViewer.
    Using the redlining tool, my application allows the user to draw a polygon on the map.
    I would like to have union, difference and intersect tools. I know the back-end logic can be done using JTS or SDO_Union, SDO_difference, and SDO_intersection, but how do I display the resulting polygon on the map? Note that I am not going to store the polygon in the database.
    Thanks for any ideas.

    Hi,
    If I understand correctly, you are using Oracle Maps. The user does some redlining, then you somehow create FOI polygons. Then you get the coordinates of the polygons and send them to the back end, where you can use spatial functions and get the coordinates of the new polygon.
    You can store those coordinates in, for example, a hidden input field on the page (if you use JSF it is very simple) and then use JavaScript to parse them and the Oracle Maps API to create a new FOI polygon and add it to the mapviewer object. One disadvantage of this concept is that reloading the page (submit) is necessary.
    Branislav
