Direct Insert or API

I want to create/insert rows directly into CI_FUNCTION_ENTITY_USAGES (i$sdd_funent) by supplying appropriate values for all the not-null columns. I can see that ci_function_entity_usages.function_ref comes from ci_functions.irid,
ci_function_entity_usages.entity_ref comes from ci_entities.irid, etc.
My question is: how do I populate ci_function_entity_usages.id,
ci_function_entity_usages.irid and ci_function_entity_usages.ivid?
Are they primed from some internal sequence or some pre-insert
trigger somewhere?
Is there a better way of doing this than a direct insert, e.g. an API?
Thanks in advance
Ian

At the moment I've no access to any Designer repository.
The idea of the API is that there is an individual PL/SQL package for each object. For ci_function_entity_usages it should be ciofunction_entity_usage. This package defines a type called 'data' with two components: i and v. i is a record of boolean flags; v is a record of values that matches the ci_function_entity_usages view.
To insert a new function-entity usage you fill in the data.v values and set the corresponding data.i elements to true for every value you supplied. Then use the ciofunction_entity_usage.ins procedure to insert the record.
You do not have to set id, irid or ivid - they are set by the API.
You can follow the example sketched below.
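A minimal sketch of what such a call might look like. Treat the surrounding cdapi activity calls, the exact ins signature and the literal irid values as assumptions recalled from the Designer API documentation rather than verified code - check the generated package specification in your own repository before relying on it:

DECLARE
  l_usage ciofunction_entity_usage.data;  -- .v holds column values, .i holds "value supplied" flags
  l_id    NUMBER;                         -- populated by the API on insert (assumed OUT behaviour)
  l_stat  VARCHAR2(1);
BEGIN
  -- Open a repository transaction ("activity"); cdapi is the generic Designer API package.
  cdapi.initialize('MY_APPLICATION');     -- assumed application system name
  cdapi.open_activity;

  l_usage.v.function_ref := 12345;        -- ci_functions.irid of the owning function (example value)
  l_usage.i.function_ref := TRUE;
  l_usage.v.entity_ref   := 67890;        -- ci_entities.irid of the used entity (example value)
  l_usage.i.entity_ref   := TRUE;

  -- id, irid and ivid are not set here: the API generates them.
  ciofunction_entity_usage.ins(l_id, l_usage);

  cdapi.validate_activity(l_stat);
  cdapi.close_activity(l_stat);
END;
/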
Hope it helps. Paweł

Similar Messages

  • Direct insert Vs collection

    Greetings,
    In one of my procedures, I used collections with a commit every 1,000 records. It took about 4 hours to process 14,000,000 records. Then, without changing the logic, I modified the same procedure to use direct INSERT and DELETE statements. It took about 1.5 hours.
    I was wrong in thinking that collections would improve performance because they fetch everything at once. Is this because of the 'commit'? I also experimented with committing every 10,000 records; even then it didn't improve much.
    Could you explain why one is slow and the other fast? Thank you,
    Lakshmi

    The rules of thumb (borrowed from Tom Kyte)
    1) If you can do it in SQL, do it in SQL
    2) If you can't do it in SQL, do it in PL/SQL. "Can't do it in SQL" generally means that you can't figure out how to code the logic in SQL and/or the business logic is so complex that it makes the SQL unreadable.
    2a) If you have to use PL/SQL, try to use collections & bulk processing.
    3) If you can't do it in PL/SQL, do it in a Java stored procedure
    4) If you can't do it in a Java stored procedure, do it in an external procedure
    Collections are never preferred over straight SQL from a performance standpoint, but they may be easier to code for more complex business rules.
    Justin
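    To make the contrast concrete, here is a sketch under assumed names (source_t and target_t are placeholder tables with matching id and val columns): the single set-based statement the rules prefer, and the bulk-bind fallback for when the row logic genuinely cannot be expressed in SQL.

    -- Rule 1: one set-based statement, committed once at the end.
    INSERT INTO target_t (id, val)
    SELECT id, val FROM source_t;
    COMMIT;

    -- Rule 2a: if PL/SQL is unavoidable, fetch and insert in bulk rather than row by row.
    DECLARE
      CURSOR c IS SELECT id, val FROM source_t;
      TYPE t_tab IS TABLE OF c%ROWTYPE;
      l_tab t_tab;
    BEGIN
      OPEN c;
      LOOP
        FETCH c BULK COLLECT INTO l_tab LIMIT 1000;   -- batch of 1,000 rows per round trip
        EXIT WHEN l_tab.COUNT = 0;
        FORALL i IN 1 .. l_tab.COUNT                  -- one bulk bind per batch, not one insert per row
          INSERT INTO target_t VALUES l_tab(i);       -- target_t assumed to match the cursor's column list
      END LOOP;
      CLOSE c;
      COMMIT;   -- commit once at the end; committing every N rows mostly adds overhead
    END;
    /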

  • DIRECT INSERTS

    Dear forum,
    We can insert directly into a table in bulk, like:
    insert into target select * from source1 a, source_2 b where a.s1 = b.s2
    From the performance alone we can tell that the above is not a row-by-row insert but a bulk (set-based) insert.
    1) Experts, please tell me how Oracle manages the above query.
    2) How many records does it send to the db at a time?
    Thanks

    Hi Keith,
    Please see my query:
    insert into pkg_tmp
    select find_key(VALUE, 8),
           mod(VL_ID, 10),
           VALUE_ST
    from   SOURCE_ATTR d,
           STOCK c,
           CONTENT b,
           PACKAGES a
    where  a.PACKAGE_ID = b.PACKAGE_ID
    and    b.TP_ID = 1
    and    b.VL_ID = c.RRS_ID
    and    c.RRS_ID = VAL_ID
    and    ATTR_ID = 11
    The tables in the from clause contain millions of records (a = 33M, b = 37M, c = 37M, d = 122M).
    I need to run the select shown above and load the result into the pkg_tmp table.
    Thanks..
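    One way to see how Oracle will execute such an INSERT ... SELECT (conventional set-based load versus direct-path load) is to look at the execution plan. A sketch reusing the poster's query; the table aliases on value, value_st, val_id and attr_id are guessed:

    EXPLAIN PLAN FOR
    INSERT INTO pkg_tmp
    SELECT find_key(d.value, 8), mod(b.vl_id, 10), d.value_st
    FROM   source_attr d, stock c, content b, packages a
    WHERE  a.package_id = b.package_id
    AND    b.tp_id = 1
    AND    b.vl_id = c.rrs_id
    AND    c.rrs_id = d.val_id
    AND    d.attr_id = 11;

    SELECT * FROM TABLE(dbms_xplan.display);
    -- On recent versions the plan shows "LOAD TABLE CONVENTIONAL" for an ordinary
    -- insert, or "LOAD AS SELECT" if an /*+ APPEND */ hint results in a direct-path
    -- load. Either way it is a single set-based SQL statement: Oracle reads and
    -- writes internally in blocks/buffers, so there is no application-visible
    -- "N records at a time" batch size.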

  • Direct Insert path

    I want to insert a large amount of data over a dblink using direct-path insert; however, as the rollback segment is quite small, I need to commit every 2,000 rows or so.
    insert /*+ append */ into abc_monthly@dblink select * from abc partition(ODFRC_20040201);
    Any help, guys!
    Or is there any other, faster way to do this insert?
    Edited by: Avi on Aug 12, 2009 9:41 AM

    Hi,
    Don't shoot the messenger but your append hint will be ignored:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:549493700346053658
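    A commonly suggested workaround (not from this thread, so treat it as a hedged sketch): because direct-path insert only works when the target table is local, run the statement on the database that owns abc_monthly and pull the rows across the link instead. Partition-extended names cannot reference a remote table, so the partition is selected here with a hypothetical partition-key column, odfrc_date:

    -- Run while connected to the database that owns abc_monthly.
    -- src_link is an assumed dblink pointing back at the source database;
    -- odfrc_date is a hypothetical column standing in for PARTITION (ODFRC_20040201).
    INSERT /*+ APPEND */ INTO abc_monthly
    SELECT *
    FROM   abc@src_link
    WHERE  odfrc_date >= DATE '2004-02-01'
    AND    odfrc_date <  DATE '2004-03-01';
    COMMIT;
    -- A direct-path load generates minimal undo for the table data, which also
    -- sidesteps the small-rollback-segment problem without committing every 2,000 rows.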

  • How to directly insert pictures inside a wiki page without the need to enter the picture's properties

    I am working on a publishing site collection that uses the Enterprise Wiki template. When I add a new picture from my PC to a wiki page, I am prompted with a dialog asking for the picture's properties.
    So can anyone advise how I can bypass this dialog and directly add the image to the wiki page?
    I tried modifying the "Images" content type by hiding the fields in question, but I am still prompted to fill them in.

    Hi,
    According to your post, my understanding is that you want to directly insert pictures inside a wiki page without the need to enter the picture's properties.
    Per my knowledge, you can use javascript to bypass Edit Properties Page while uploading pictures.
    Here are some similar articles for your reference:
    How to bypass Edit Properties Page while uploading documents in a document library
    SharePoint and .NET: How to bypass Edit Properties Page or Skip EditForm.aspx Page while uploading documents in a document library
    Best Regards,
    Linda Li
    TechNet Community Support

  • Direct provisioing using API in OIM 11g

    Hi Experts,
    I am facing couple of issues.
    1) I am trying to provision a resource directly using the APIs in OIM 11g. Here I do not have an object form for this resource, but I have a process form with some pre-population adapters.
    I am using the code below for direct provisioning.
    Hashtable objectHash = new Hashtable();
    objectHash.put("Objects.Name", objectName);
    tcResultSet objectResultSet = objIntf.findObjects(objectHash);
    long objectKey = objectResultSet.getLongValue("Objects.Key");
    com.thortech.xl.vo.ResourceData data = userIntf.provisionResource(userKey, objectKey);
    long userObjectInstanceKey = Long.parseLong(data.getOiuKey());
    long objectInstanceKey = Long.parseLong(data.getObiKey());
    I am getting nulls for userObjectInstanceKey and objectInstanceKey.
    Please let me know how to use the API to provision a resource that has no object form but has pre-population adapters.
    2) I am trying to provision a resource directly using the APIs in OIM 11g. Here I do have an object form for this resource, and one of its attributes is of type lookup.
    ResourceData data = userIntf.provisionResource(userKey, resourceKey);
    long userObjectInstanceKey = Long.parseLong(data.getOiuKey());
    long objectInstanceKey = Long.parseLong(data.getObiKey());
    Hashtable objectHash = new Hashtable();
    objectHash.put("UD_ADGROUP_NAME",groupName);
    In this case I am getting objectInstanceKey properly, but it is not setting the lookup field value, while it sets all the other fields on the object form correctly.
    How do I set a field of type lookup on the object form while provisioning a resource directly through the APIs?
    Thanks a lot for your help.

    947670 wrote:
    Hi Pallavi,
    I am not populating any object form. I am trying to provision a resource directly through the OIM APIs.
    Hence, I need to populate all of the process form fields in order to skip the approval flow, so I was using the setProcessFormData method.
    Here is what is happening:
    1) My resource has a request dataset which has only one field, called "Group Name".
    2) My resource has a process form with the fields UD_GROUP_NAME, UD_GROUP_DESCRIPTION and UD_GROUP_OWNER.
    3) When I use the code below, I am able to populate UD_GROUP_DESCRIPTION and UD_GROUP_OWNER (because the pre-populate adapters get invoked), as they do not exist on the request dataset.
    tcFormInstanceOperationsIntf formInstanceOps=Platform.getService(tcFormInstanceOperationsIntf.class);
    ResourceData data = userIntf.provisionResource(userKey, resourceKey);
    long userObjectInstanceKey = Long.parseLong(data.getOiuKey());
    long objectInstanceKey = Long.parseLong(data.getObiKey());
    Hashtable objectHash = new Hashtable();
    objectHash.put("*UD_ADGROUP_NAME*",groupName);
    formInstanceOps.setProcessFormData(objectInstanceKey, objectHash);
    4) I am having this issue only with fields that exist on the request dataset. Since UD_GROUP_NAME exists on the request dataset, even if I try to set a value for it on the process form, it is not taken.
    Using the APIs, I am not able to populate any process form attribute that also exists on the request dataset.
    How do I solve this issue?
    1. Check the process form field name.
    2. Use the tcUserOperationsIntf API method provisionObject(userKey, objectKey):
    userIntf.provisionObject(userKey, objectKey);
    3. Get the process instance key:
    tcResultSet objResultSet = userIntf.getObjects(userKey);
    int objCount = objResultSet.getRowCount();
    for (int count = 0; count < objCount; count++) {
        objResultSet.goToRow(count);
        if (objResultSet.getStringValue("Objects.Name").equalsIgnoreCase(resourceObjectName)) {
            processInstanceKey = objResultSet.getLongValue("Process Instance.Key");
        }
    }
    4. Use the tcFormInstanceOperationsIntf API method setProcessFormData(processInstanceKey, dataMap).
    Hope this helps you.

  • How to load data to a "clustered table" quickly?  direct insert not work

    We have a single hash-clustered table whose size is 45G, and we used the command:
    insert /*+ append */ into t_hashed select * from source;
    The source is about 30G, and it is taking a very long time (several days, and it has not yet completed).
    I dumped the redo log and found lots of redo for the insert. I had thought it would not create much redo, since it is a "direct path insert", but from the dump I can tell that the direct-path insert did not take effect.

    Your assessment is correct. INSERT /*+ APPEND */ does not work with a hash clustered table.
    If you think about how hash clusters work, and how direct load insert works, it should be clear to you why a direct load insert via the append hint can never work.
    How hash clusters work (well, only the part that's relevant to the current discussion):
    Initially, at creation time, the storage for the entire cluster is preallocated, based on the various cluster parameters you chose. Think about that. All the storage is allocated to the cluster.
    How direct load works (well, only the part that's relevant to the current discussion):
    Space already allocated to and used by the segment is ignored. Space is taken from free, unformatted blocks from above the high water mark.
    So, hopefully it's clear how these features are incompatible.
    Bottom line:
    It doesn't work, there's no way around it, there's nothing you can do about it....
    -Mark
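    For reference, the preallocation Mark describes happens when the cluster is created; a minimal sketch with illustrative names and sizes:

    -- Storage for every hash bucket is allocated up front, sized from SIZE and HASHKEYS.
    CREATE CLUSTER demo_hash_cluster (id NUMBER)
      SIZE 512 HASHKEYS 1000000;

    CREATE TABLE t_hashed_demo (
      id  NUMBER,
      pad VARCHAR2(100)
    ) CLUSTER demo_hash_cluster (id);

    -- Because the cluster's blocks are already allocated, there is no fresh space
    -- above the high-water mark for a direct-path load to write into, so
    -- INSERT /*+ APPEND */ INTO t_hashed_demo ... silently runs as a conventional insert.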

  • Direct provisioning through API - OIM 11g

    Hi,
    OIM 11g here. I am trying to use the APIs to do direct provisioning. What I have done so far:
    tcUserOperationsIntf userIntf = (tcUserOperationsIntf)ioUtilityFactory.getUtility("Thor.API.Operations.tcUserOperationsIntf");
    ResourceData rd = userIntf.provisionResource(userkey, objectkey);
    Now, in the ResourceData object I have two IDs, obiKey and oiuKey, and I need to get from them to the process instance key. How can I do this?
    Using the userIntf getObjects method I can get the list of provisioned objects, iterate over it, and retrieve the process instance key of the object that matches obiKey and oiuKey. Is there an easier way to do this?
    Another question: which one is the process instance key, ORC_KEY or ORC_TOS_INSTANCE_KEY?
    Last, how do I trigger the task responsible for provisioning once the process form is filled in?
    thx in advance

    OK, I guess the process instance key is ORC_KEY.
    Now I am trying to provision a resource object (say AD User) to an OIM user through the APIs. I have used the provisionResource(userkey, objectkey) method, but the Create User task is not put in the Resource History (there is only the System Validation task), and I don't know how to look up its task ID to add it manually.

  • Cursor v/s direct insert

    hi all,
    which one is better...?
    defining a cursor and inserting row by row, or putting the select statement inside the insert?
    ex1:
    declare
      cursor c1 is
        select a, b from table1;
    begin
      for i in c1 loop
        insert into table2 values (i.a, i.b);
      end loop;
    end;
    ex2:
    begin
      insert into table2
      select a, b from table1;
    end;
    Thanks

    ...and here are the facts:
    SQL> set timing on
    SQL> declare
      2  cursor c is
      3  select *
      4  from all_Tables;
      5  begin
      6  for r in c loop
      7  insert into test values(
      8  r.OWNER                    ,
      9  r.TABLE_NAME               ,
    10  r.TABLESPACE_NAME          ,
    11  r.CLUSTER_NAME             ,
    12  r.IOT_NAME                 ,
    13  r.PCT_FREE                 ,
    14  r.PCT_USED                 ,
    15  r.INI_TRANS                ,
    16  r.MAX_TRANS                ,
    17  r.INITIAL_EXTENT           ,
    18  r.NEXT_EXTENT              ,
    19  r.MIN_EXTENTS              ,
    20  r.MAX_EXTENTS              ,
    21  r.PCT_INCREASE             ,
    22  r.FREELISTS                ,
    23  r.FREELIST_GROUPS          ,
    24  r.LOGGING                  ,
    25  r.BACKED_UP                ,
    26  r.NUM_ROWS                 ,
    27  r.BLOCKS                   ,
    28  r.EMPTY_BLOCKS             ,
    29  r.AVG_SPACE                ,
    30  r.CHAIN_CNT                ,
    31  r.AVG_ROW_LEN              ,
    32  r.AVG_SPACE_FREELIST_BLOCKS,
    33  r.NUM_FREELIST_BLOCKS,
    34  r.DEGREE             ,
    35  r.INSTANCES          ,
    36  r.CACHE              ,
    37  r.TABLE_LOCK         ,
    38  r.SAMPLE_SIZE        ,
    39  r.LAST_ANALYZED      ,
    40  r.PARTITIONED        ,
    41  r.IOT_TYPE           ,
    42  r.TEMPORARY          ,
    43  r.SECONDARY          ,
    44  r.NESTED             ,
    45  r.BUFFER_POOL        ,
    46  r.ROW_MOVEMENT       ,
    47  r.GLOBAL_STATS       ,
    48  r.USER_STATS         ,
    49  r.DURATION           ,
    50  r.SKIP_CORRUPT       ,
    51  r.MONITORING         ,
    52  r.CLUSTER_OWNER      ,
    53  r.DEPENDENCIES       ,
    54  r.COMPRESSION        ,
    55  r.DROPPED
    56  );
    57  end loop;
    58* end;
    SQL> /
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:04.07
    SQL> insert into test
      2  select *
      3  from all_tables;
    3718 rows created.
    Elapsed: 00:00:01.07
    SQL>
    Regards,
    Gerd

  • Getting error when inserting Web API/Table Inteface class in Web Template !

    Hi Experts,
    I am getting the error "Invalid Renderer class: '&1' ZTEST_BWREPORT_ADI"
    while inserting <param name="ITEM_CLASS" value="ZTEST_BWREPORT_ADI"/> into the HTML code of the template (which has one table web item), before the <param name="ITEM_CLASS" value="CL_RSR_WWW_ITEM_GRID"/> statement, in order to pick up the customization I have done in ZTEST_BWREPORT_ADI via the table interface.
    Please look into the issue and suggest possible solutions as soon as possible.
    Regards,
    Aditya Srivastava.

    Hi,
    I got the solution, and it is nothing tricky. I just had to insert <param name="MODIFY_CLASS" value="ZTEST_BWREPORTS_ADI"/> after <param name="ITEM_CLASS" value="CL_RSR_WWW_ITEM_GRID"/> to get the desired result. I don't know the reason (since I am very new to BW), but at least it is working.
    Regards,
    Aditya Srivastava.

  • Toplink on existing application

    Hi
    We are trying to use TopLink for an existing J2EE application. The existing application uses a DAO approach (we have a plain Java object model, and whenever we want to persist an object we call its corresponding DAO). Each DAO has insert and update methods. Transactions are handled by the container, and we use a stateless session bean for that; each EJB method execution is a complete transaction. Database connections are maintained by the application server (WebSphere 4) and we use JNDI to get hold of connections.
    1. If we migrate to TopLink, can we use the same architecture? If not, what would be the best approach?
    2. Should we use DatabaseSession or ServerSession?
    3. If we use ServerSession, we found that we have to use a unit of work for persisting, but the unit of work doesn't have a direct insert/update API. So can we use the DAO approach with ServerSession?
    4. The database is DB2 and some tables use identity columns. Is there any workaround (like modifying the source) for that, as we found TopLink doesn't support identity columns?
    thanks
    shibin

    1 - If you just want to use TopLink to take your value objects and push them into the database, and build objects out of result sets, then you could just use DatabaseSession and the write/readObject API. But the value of TopLink is its concurrency support and caching/locking/sequencing/transaction support, i.e. the Unit of Work. I'd need some time to look at what you have, but generally in these situations you could simply plug TopLink into your DAO pattern with DatabaseSession and the read/writeObject API; for the long term, though, I'd recommend considering some architecture changes to leverage the UOW and other features.
    2 - Ultimately you want the concurrent and scalable ServerSession, but if you're just looking for a quick plug of TopLink into your existing app, you might have to use multiple DatabaseSessions. I would need to see your app to offer better advice.
    4 - We do support Identity on Sybase and MS SQL Server, but for some technical reasons dating back several years, we do not support native sequencing on DB2.
    - Don

  • Supplier Conversion :auto_tax_calc_flag error

    Hello everyone ,
    Currently I am working on a supplier conversion from 11.0.3 to 11.5.10.2.
    In the supplier site interface table (ap_supplier_sites_int) I am getting an error for auto_tax_calc_flag when the value is 'L'. I changed one setup: when I checked the 'Use Automatic Tax Calculation' box in Payables Options => Invoice Tax, the error did not occur. But my client will not allow this setup change.
    Is there any other way we can fix this issue?
    It is very urgent; please reply as soon as possible.
    Thanks in Advance
    Apps4u.

    See, actually I am using a customized program, and we have been given a data file with line type 'LINE' and a tax code, and auto accounting is off. So what I did: I imported only one line with a tax code into ra_interface_lines_all and created the 'REV', 'REC' and 'TAX' lines in ra_interface_distributions_all, but when I ran AutoInvoice import it gave me: "The accounting distributions for this transaction are linked to the wrong type of line. (account_class: TAX, line_type: LINE)". Did you check the docs referenced in your other thread? error: auto invoice distribution error
    So then I created a TAX line and linked it to the invoice line, as in the example above, and when I ran it again it gave me "Populate either tax_rate_id or tax_rate_code to create a valid tax line". I hope you understood the requirement.
    I still believe you should not use a direct insert, and an API should be used instead -- please log an SR to confirm this with Oracle Support.
    Thanks,
    Hussein

  • API for insert Lead into telesales

    How can I insert a lead for a customer using an API in TeleSales?

    Hi,
    Currently, I believe there are no APIs to populate the pa_transaction_control table.
    If this is a new implementation and you are converting the projects from your legacy system for the first time, you should be able to insert data directly into these tables. This table does not have many FK relationships. Mainly, depending on whether transaction control is set at the Project or the Task level, you can set the values for Project ID / Task ID.
    I would also suggest that you confirm with Oracle, by creating a TAR, that there is no API currently available.
    - Vasu -

  • Direct Entitlement API

    Hi,
    I've created a web service based on the Direct Entitlement API v2 - http://download.macromedia.com/pub/developer/dps/entitlement/direct-entitlement-api.pdf
    Based on my testing, it passes the specification in the API document.
    All methods were created. The service is a .net web service (.asmx) and methods are below.
    https://<website>/dps/entitlement.asmx/verifyEntitlement
    https://<website>/dps/entitlement.asmx/SignInWithCredentials
    https://<website>/dps/entitlement.asmx/RenewAuthToken
    https://<website>/dps/entitlement.asmx/entitlements
    What Service URL / Service Auth URL should I be using? I tried https://<website>/dps/entitlement.asmx/ (with a trailing slash) and https://<website>/dps/entitlement.asmx (without), but neither works.
    Is there a way I can troubleshoot, e.g. see what code in the DPS app is consuming the web service, to know whether it can actually connect to the service, or see what error might occur?
    Thanks!

    Thanks for all the replies. I was able to figure out what was causing the error using the Charles proxy tool. The errors pointed me to an integrator version mismatch. I re-created the service to the specifications in the latest version of the integrator. The latest integrator details (direct-entitlement-api.pdf) can be found here - http://www.adobe.com/devnet/digitalpublishingsuite/articles/dps-entitlement.html
    This issue is closed.

  • Inserting the Brio Query Data into EIS OLAP Model  tables directly

    Dear Experts,
    Can someone please suggest how we can export data (result-set records) from the bqy files' queries into an EIS server's OLAP model tables?
    Right now I have a cube on the Essbase server getting its data from Excel sheets, which store the data produced by executing the .bqy files.
    The business users do not like the use of the file system (Excel sheets), so I now need to avoid storing the data from the Brio queries in Excel sheets for loading into the Essbase cube. I am required to achieve this using EIS/AIS, so that the data from the Brio queries (.bqy files) can be inserted into the Essbase cube directly via EIS/AIS.
    Any quick help would boost the life of this project of mine.
    Thank you,
    Vikas Jain

    user12072290 wrote:
    Dear Experts,
    Can someone please suggest how we can export data (result set records) from bqy files' queries into an EIS server's OLAP Model's tables?
    Right now I have a cube on Essbase server getting the data from excel sheets which store the data from the execution of .bqy files.
    Use of file system (excel sheets) has not been liked by my business users so now I need to avoid storing the data from brio queries into excel sheets for loading into the Essbase cube. This I am required to achieve using EIS/AIS so that the data from brio queries(.bqy files) can be directly inserted into Essbase cube with the help of (i.e. via) EIS/AIS.
    Any quick help would boost the life of this project of mine.
    Thank you,
    Vikas Jain
    Have you got the answer? Would you please post it here?
