Data mismatch on OIM database

Hi *,
I imported some newly created resources to our live system. The import returned an error and terminated.
Later I found that our OIM database had reached its maximum allowed size.
But now there are data mismatches:
The UD_XXX table was created in the OIM database, but it does not show up in the Form Designer in the OIM Design Console.
Then I created a new form, "UD_YYY", in the Form Designer in the OIM Design Console. But when I log on to the database and check, the "UD_YYY" table was not created in the OIM DB, although it does show in the Design Console.
I ran the following statement to increase the datafile's maximum size:
ALTER DATABASE DATAFILE '<path>XELTBS_01.DBF' AUTOEXTEND ON NEXT 50M MAXSIZE 5120M;
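To confirm the change took effect, a check like the following against DBA_DATA_FILES should show AUTOEXTENSIBLE = YES and the new maximum (a sketch; I am assuming the tablespace is named XELTBS based on the datafile name):

SELECT file_name,
       bytes / 1024 / 1024    AS size_mb,
       autoextensible,
       maxbytes / 1024 / 1024 AS max_mb
FROM   dba_data_files
WHERE  tablespace_name = 'XELTBS';  -- tablespace name assumed from the file name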
Now, how can I synchronize the OIM tables?
Is it OK if I manually delete those tables? In which table is this custom table info (i.e. UD_*) kept? Is it OK if I delete those entries manually?
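For what it's worth: in the OIM releases I have worked with, custom form definitions are tracked in the SDK table (with their columns in SDC), so a hedged comparison like the sketch below can show where the Design Console metadata and the physical UD_* tables disagree. Verify against your own schema, and take a backup before deleting anything manually:

-- Hedged sketch: compare form metadata (SDK) against physical tables (USER_TABLES)
SELECT sdk.sdk_name, tab.table_name
FROM   sdk
FULL OUTER JOIN user_tables tab
       ON tab.table_name = UPPER(sdk.sdk_name)
WHERE  sdk.sdk_name LIKE 'UD_%'
   OR  tab.table_name LIKE 'UD\_%' ESCAPE '\';

A row with a table name but no SDK entry (like UD_XXX above), or an SDK entry with no table (like UD_YYY), would confirm the mismatch.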
Help....
Regards,
Chaturanga

Hi Suren,
Yes, I can do that. But is it OK to keep this mismatched data in OIM?
Also, the following error occurs continuously in the log:
ERROR [XELLERATE.DATABASE] Class/Method: tcDataBase/writeStatement encounter some problems: ORA-01400: cannot insert NULL into ("XLADM"."UPA_RESOURCE"."OBJ_KEY")
java.sql.SQLException: ORA-01400: cannot insert NULL into ("XLADM"."UPA_RESOURCE"."OBJ_KEY")
I think this is due to the data mismatch problem.
Regards,
Chaturanga

Similar Messages

  • OSI / OTI table data mismatch, OTI being subset of OSI

    Hi,
    We have a custom application where we fetch the count and details of all provisioning records assigned to the logged-in user.
    To fetch details of open provisioning tasks, Oracle's recommendation is to use the OTI table.
    When we studied the current system, we made the findings below regarding the behavior of the OSI and OTI tables:
    1. When a request is submitted to OIM, an approval process is triggered depending on the approval policy.
    2. This workflow is responsible for creating approval tasks.
    3. When the approval tasks are approved, the request in OIM is closed and the provisioning task for the target system gets created.
    4. The OIM database tables used to fetch these provisioning tasks are OSI/OTI.
    5. According to the Oracle documentation, the OTI table is a subset of the OSI table.
    6. When we checked both tables, we found that the entries in them are not exactly the same.
    7. For a particular request, the OSI table had OSI_ASSIGN_TYPE as 'Group' while the OTI table had OSI_ASSIGN_TYPE as 'User'.
    8. As the OTI table has a null osi_assigned_to_ugp_key value, it cannot be used.
    It looks like the OTI table does not hold correct data. The OSI table has around 9.6 million records, so when we query OSI we get correct data, but it takes 6 minutes.
    We are now investigating why this mismatch exists and how it can be resolved. (See the sketch below for one way to surface the mismatched rows.)
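    A quick way to list such rows (a sketch only; it assumes OTI mirrors OSI's SCH_KEY, OSI_ASSIGN_TYPE, and OSI_ASSIGNED_TO_UGP_KEY columns, as the findings above suggest):

    -- List tasks where OSI and OTI disagree on the assign type
    SELECT osi.sch_key,
           osi.osi_assign_type         AS osi_type,
           oti.osi_assign_type         AS oti_type,
           oti.osi_assigned_to_ugp_key AS oti_ugp_key
    FROM   osi
    JOIN   oti ON oti.sch_key = osi.sch_key
    WHERE  NVL(osi.osi_assign_type, '-') <> NVL(oti.osi_assign_type, '-');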
    Regards,
    Deepika

    Hi Kevin,
    Thanks for these details.
    We will try this query in our environment and see the result.
    Regarding the OSI/OTI data mismatch: I understand that OTI contains tasks that are still open, i.e. Rejected and Pending. These tasks are also present in OSI; when we check a record in OSI the task is assigned to a user, while the same record in OTI shows a changed assign type.
    Is that intended functionality, or is something wrong here?
    Because of this we cannot get correct data directly from the OTI table and have to use OSI (which hampers performance).
    This is the query we are using:
    select distinct oti.orc_key, a.usr_login,
           (select min(inner.oti_create) from oti inner where inner.orc_key = oti.orc_key) as oti_create
    from oti
         inner join oiu on oti.orc_key = oiu.orc_key
         inner join usr a on oiu.usr_key = a.usr_key
         inner join osi on oti.sch_key = osi.sch_key
    where sch_status in ('P', 'R')
      and obj_key in (select obj_key from obj
                      where obj_name in ('DB App', 'K2 App', 'LDAP App', 'SPML App', 'Manual App'))
      and (osi.osi_assign_type = 'Group'
           and osi.osi_assigned_to_ugp_key in
               (select ugp_key from usg where usr_key = " + Long.toString(Oim.getCurrentUser().getKey()) + "))
    UNION ALL
    select distinct oti.orc_key, a.usr_login,
           (select min(inner.oti_create) from oti inner where inner.orc_key = oti.orc_key) as oti_create
    from oti
         inner join oiu on oti.orc_key = oiu.orc_key
         inner join usr a on oiu.usr_key = a.usr_key
         inner join osi on oti.sch_key = osi.sch_key
    where sch_status in ('P', 'R')
      and obj_key in (select obj_key from obj
                      where obj_name in ('DB App', 'K2 App', 'LDAP App', 'SPML App', 'Manual App'))
      and (osi.osi_assign_type in ('User', 'Default task assignment')
           and osi.osi_assigned_to_usr_key = " + Long.toString(Oim.getCurrentUser().getKey()) + ")
    order by oti_create
    Regards,
    Deepika

  • Urgent! OIM database application Connector

    I am using the 9.0.4 OIM connector pack. How can I make sure the database application is properly deployed and that reconciliation is taking place? And how do you make sure it is configured properly?
    I am using a SQL Server database.

    Are you performing a trusted reconciliation (trying to create OIM users from the recon events), or are you just trying to match the users to their profiles on the target system?
    You need to look at the following places for your reconciliation pieces.
    1. Resource Object - Reconciliation tab. These are the fields that will appear in your processed data. You need to also check the rules tab in the same location. The entity match occurs based on your reconciliation rule (the bottom most item on the menu from the java client). This rule matches a value that comes in from the target system and a value on the user's OIM profile so the entity match can be made. This event triggers the Reconciliation Insert task which creates the resource profile for this user.
    2. Process Definition - Reconciliation Data Mapping. The values here represent the mapping from the fields noted on your resource object recon tab onto the resource's process form. After you have an entry for the resource available on the user's resource profile, these fields determine your "process match". You must define a key field here so that OIM can determine what to keep matching the user's profile against when new recon events arrive to update the process form for the user.
    If you are performing a trusted reconciliation, you need to define all these items on the Xellerate User object because you are mapping the target system data to your OIM user profile. You must also provide, at a minimum, the following fields: First Name, Last Name, Password, Role, Type, Organization (this needs to be the key in the end, but hopefully the connector converts from the name to the key).
    -Kevin

  • Cisco Historical reports Data Mismatches

    Dear all,
    I need to ask about Cisco Historical Reports. I am using UCCX 8.6 and CUCM 8.6.
    I am collecting data on presented, handled, and abandoned calls from these three reports:
    CCDR Call Detail
    CSQ Detail
    Call Abandoned Detail
    The data differs between all of these reports, and the difference is around 1000 calls over the whole month; presented, handled, and abandoned all show mismatched data.
    My second question is which report I should use to get data on presented calls, handled calls, abandoned calls, and per-agent abandoned calls. Please advise.
    Please help me resolve this issue.

    Kindly read the following for a better understanding of their relationship:
    a) UCCX Reporting Guide - Data reconciliation between reports
    b) UCCX Report Developer Guide - Interpret Database Records
    Thanks!
    -JT-

  • Data mismatch between 10g and 11g.

    Hi
    We recently upgraded OBIEE from 10.1.3.4.0 to 11.1.1.6.0. While testing, we found a data mismatch between 10g and 11g in a few reports that include a front-end calculated column with division in it, for example ("- Paycheck"."Earnings" / COUNT(DISTINCT "- Pay"."Check Date")) / 25.
    The data matches in the following scenarios:
    1) When the column is removed from both 10g and 11g.
    2) When the aggregation rule is set to either "Sum" or "Count" in both 10g and 11g.
    It would be very helpful and greatly appreciated if any workaround or pointers to solve this issue were provided.
    Thanks

    jfedynic wrote:
    The 10g and 11.1.0.7 Databases are currently set to AL32UTF8.
    In each database there is a VARCHAR2 field used to store data, not specifically AL32UTF8 data but encrypted data.
    Using the 10g client to connect to either the 10g or 11g database works fine.
    Using the 11.1.0.7 client against either the 10g or 11g database produces the error: ORA-29275: partial multibyte character.
    What has changed?
    Was it considered a Bug in 10g because it allowed this behavior and now 11g is operating correctly?
    29275, 00000, "partial multibyte character"
    // *Cause:  The requested read operation could not complete because a partial
    //          multibyte character was found at the end of the input.
    // *Action: Ensure that the complete multibyte character is sent from the
    //          remote server and retry the operation. Or read the partial
    //          multibyte character as RAW.
    It appears to me a bug got fixed.

  • Data mismatch in Test and Prod environments

    Hi,
    We have a query in the Test and Prod environments. This query is not giving the same result in Test and Production. Please have a look and share your thoughts.
    Select D1.C3, D1.C21, D2.C3, D2.C21
    from
      (select sum(F.X_SALES_DEDUCTION_ALLOC_AMT) as C3,
              O.Customer_num as C21
       from ESA_W_ORG_D O,
            ESA_W_DAY_D D,
            ESA_W_SALES_INVOICE_LINE_F F
       where O.ROW_WID = F.CUSTOMER_WID
         and D.ROW_WID = F.INVOICED_ON_DT_WID
         and D.PER_NAME_FSCL_MNTH = '2012 / 12'
       group by O.Customer_num) D1,
      (select sum(F.X_SALES_DEDUCTION_ALLOC_AMT) as C3,
              O.Customer_num as C21
       from Sa.W_ORG_D@STPRD O,
            Sa.W_DAY_D@STPRD D,
            Sa.W_SALES_INVOICE_LINE_F@STPRD F
       where O.ROW_WID = F.CUSTOMER_WID
         and D.ROW_WID = F.INVOICED_ON_DT_WID
         and D.PER_NAME_FSCL_MNTH = '2012 / 12'
       group by O.Customer_num) D2
    where D1.C21 = D2.C21
      and D1.C3 <> D2.C3;
    I have done the following steps:
    1. I created a temporary table and searched for duplicate records, because if any duplicates were found I planned to delete them, but I didn't find any. I also searched for common column values using an equi-join condition. Are there any other possible causes of the data mismatch?
    2. The query takes around 45 minutes to retrieve the output. I want to improve its performance, so I created a unique index on 5 columns, but it still takes the same time.
    I also ran the query with the ALL_ROWS hint, but it still takes the same time.
    Can you suggest anything to improve the performance?
    I appreciate your support.
    Thanks.

    If you can create a temporary database link between the two environments, use DBMS_RECTIFIER_DIFF or DBMS_COMPARISON to compare the two tables' contents.
    http://www.morganslibrary.org/reference/pkgs/dbms_comparison.html
    http://www.morganslibrary.org/reference/pkgs/dbms_rectifier_diff.html
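    A minimal DBMS_COMPARISON sketch over the existing STPRD link, assuming Oracle 11g or later on the local side and an indexed key column on the table (the comparison name is invented, and the local schema is assumed to be the current user):

    BEGIN
      -- Define the comparison once; STPRD is the existing DB link from the query above
      DBMS_COMPARISON.CREATE_COMPARISON(
        comparison_name    => 'CMP_SALES_F',
        schema_name        => USER,  -- local schema, assumed
        object_name        => 'ESA_W_SALES_INVOICE_LINE_F',
        dblink_name        => 'STPRD',
        remote_schema_name => 'SA',
        remote_object_name => 'W_SALES_INVOICE_LINE_F');
    END;
    /
    DECLARE
      v_scan       DBMS_COMPARISON.COMPARISON_TYPE;
      v_consistent BOOLEAN;
    BEGIN
      -- Run the scan and record row-level differences
      v_consistent := DBMS_COMPARISON.COMPARE(
                        comparison_name => 'CMP_SALES_F',
                        scan_info       => v_scan,
                        perform_row_dif => TRUE);
      IF NOT v_consistent THEN
        DBMS_OUTPUT.PUT_LINE('Differences found; scan id = ' || v_scan.scan_id);
      END IF;
    END;
    /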

  • "OIM Database Application Connector" is Recon the same user many times!

    Hey,
    I am facing an interesting issue: my OIM Database Application Connector is reconciling (creating) the same user many times.
    I have created and configured the "OIM DB Application Connector", which should reconcile a new user into OIM whenever a new user is created in the database via the portal. I scheduled the connector to run every 15 minutes. The connector is working as expected and creates a new OIM user whenever a new user appears in the DB table.
    The issue: I created one user in the table, which was reconciled into OIM, and I can see the entry in the Design Console Recon Manager. After 15 minutes, when the connector runs again, it picks up the same user, and it keeps picking up the same user every time it runs! It stops picking up the user after some time, but I don't know exactly when. I have seen the same user at least 25+ times, and every time the status is EVENT LINKED. Any idea why this is happening? My matching criteria are Time Stamp Attribute: "Updated_By_Sysate" and Unique Attribute: "User_ID".
    My Env:
    OIM Version: 9101
    Server : Weblogic
    DB : SQL Server 2005 (Source DB)
    Any idea please?
    thanks
    kln

    1) Yes, you should add to your resource object all fields that are defined in the xel_data_source parameters of the config.xml file.
    2) Correct. You have to create a user-defined field in the Form Designer and map it to a column in your process definition (Reconciliation Field Mapping tab).
    3) The reconciliation rule is the rule that OIM uses to link database users and OIM users. You need to create a recon rule using an attribute that has the same value in both systems. You also need to define this rule in your config.xml file (see how to configure reconciliation tasks in the connector documentation). The attribute used in your recon rule must be required because it will be used to create or link users. You can define any other fields as required, but if one of these required fields is not filled, you will receive a "Required Data Missing" error on your Reconciliation Manager event.
    4) Reconciliation is used to update OIM with changes made directly in your database table. To update your database table based on OIM changes, you must modify a user attribute in your UD_DBAPP user form.
    Regards.

  • Issue in creation of group in oim database through sql query.

    Hi guys,
    I am trying to create a group in the OIM database through a SQL query:
    insert into ugp(ugp_key,ugp_name,ugp_create,ugp_update,ugp_createby,ugp_updateby) values(786,'dbrole','09-jul-12','09-jul-12',1,1);
    It inserts the group into the UGP table, but the group does not show in the admin console.
    After that I also tried this query:
    insert into gpp(ugp_key,gpp_ugp_key,gpp_write,gpp_delete,gpp_create,gpp_createby,gpp_update,gpp_updateby) values(786,1,1,1,'09-jul-12',1,'09-jul-12',1);
    but still no use.
    I also tried to assign a user to the group through a query:
    insert into usg(ugp_key,usr_key,usg_priority,usg_create,usg_update,usg_createby,usg_updateby) values(4,81,1,'09-jul-12','09-jul-12',1,1);
    But it is still the same problem: the row is inserted in the DB but not listed in the admin console.
    thanks,
    hanuman.

    Hanuman Thota wrote:
    Hi Vladimir,
    I didn't find this 'ugp_seq'. Is it a table or a column? Where is it?
    It is a sequence.
    See here for details on oracle sequences:
    http://www.techonthenet.com/oracle/sequences.php
    Most of the OIM database schema is created with the following script, located in the RCU distribution:
    $RCU_HOME/rcu/integration/oim/sql/xell.sql
    There you'll find plenty of sequence-creation directives, like:
    create sequence UGP_SEQ
    increment by 1
    start with 1
    cache 20
    to create a sequence, and
    INSERT INTO UGP (UGP_KEY, UGP_NAME, UGP_UPDATEBY, UGP_UPDATE, UGP_CREATEBY, UGP_CREATE,UGP_ROWVER, UGP_DATA_LEVEL, UGP_ROLE_CATEGORY_KEY, UGP_ROLE_OWNER_KEY, UGP_DISPLAY_NAME, UGP_ROLENAME, UGP_DESCRIPTION, UGP_NAMESPACE)
    VALUES (ugp_seq.nextval,'SYSTEM ADMINISTRATORS', sysadmUsrKey , SYSDATE,sysadmUsrKey , SYSDATE, hextoraw('0000000000000000'), 1, roleCategoryKey, sysadmUsrKey, 'SYSTEM ADMINISTRATORS', 'SYSTEM ADMINISTRATORS', 'System Administrator role for OIM', 'Default');
    as a sequence usage example.
    Regards,
    Vladimir
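    Applied to the original statement, a hedged rewrite would look like this (note that rows inserted directly may still not appear in the admin console until all required columns are populated and OIM's caches are refreshed, so the supported APIs are the safer route):

    INSERT INTO ugp (ugp_key, ugp_name, ugp_create, ugp_update, ugp_createby, ugp_updateby)
    VALUES (ugp_seq.NEXTVAL, 'dbrole', SYSDATE, SYSDATE, 1, 1);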

  • Update/insert/delete data from xcelsius to Database via web service

    Hi,
    I need to create a dashboard whose functions can update/insert/delete data in a database through web services. As far as I know, there are two Xcelsius add-ons that support this kind of functionality, InfoBurst and Flynet:
    InfoBurst
    http://www.infosol.com/azbocug/minutes/4-Writeback%20to%20a%20Database%20with%20Xcelsius.pdf
    Flynet
    http://www.flynetviewer.com/public/community/Blogs/FlynetXcelsiusServerUser/default.aspx
    Apart from these two purchasable Xcelsius add-ons, is there any other solution?
    Maybe I need to write something in MSSQL or C# that enables insert, update, delete, etc.?
    *Note: I do not use Xcelsius Engage Server, only Xcelsius Engage.
    thanks,
    regards
    s1

    Hi,
    As of now, Xcelsius/Dashboard Design has no built-in feature to insert, update, or delete data in a database.
    Solution:
    Create a web service in, say, C# or Java that performs the insert/update/delete operation.
    In Xcelsius, add a Web Service connection and use the above web service.
    The Xcelsius Web Service connection provides the option to pass input values to a web service (Input pane) and get the result (Output values pane).
    We can pass the values to be written to the database as input to the web service via the Web Service connection from Xcelsius, and the service writes the data to the database.
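    On the database side, such a web service would typically just call a stored procedure. A minimal sketch, written in PL/SQL to match the rest of this thread (a T-SQL version would be analogous); the procedure, table, and column names are invented for illustration:

    -- Hypothetical upsert procedure a C#/Java web service could expose to Xcelsius
    CREATE OR REPLACE PROCEDURE upsert_dashboard_value (
      p_id    IN NUMBER,
      p_value IN NUMBER
    ) AS
    BEGIN
      MERGE INTO dashboard_data d  -- placeholder table
      USING (SELECT p_id AS id, p_value AS val FROM dual) s
         ON (d.id = s.id)
      WHEN MATCHED     THEN UPDATE SET d.val = s.val
      WHEN NOT MATCHED THEN INSERT (id, val) VALUES (s.id, s.val);
    END upsert_dashboard_value;
    /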
    Note:
    Performing a delete operation from an Xcelsius dashboard could be risky and may delete important data from the database. I would not recommend providing delete functionality in an Xcelsius dashboard.
    Hope this helps!
    Thank you.
    Regards,
    Vinay Mhaske

  • ORA-28150 when accessing data from a remote database

    Portal Version: Portal 3.0.9.8.0
    RDBMS Version: 8.1.6.3.0
    OS/Vers. Where Portal is Installed:: Solaris 2.6
    Error Number(s):: ORA-28150
    I have a problem with using a database link to access a table in
    a remote database. So long as the dblink uses explicit logins
    then everything works correctly. When the dblink does not have a
    username then I get the ORA-28150 message. The database link is
    always public. A synonym is created locally that points to a
    synonym in the remote database. I am using the same Oracle user
    in both databases. The Oracle portal lightweight user has this
    same Oracle user as its default schema. The contents of the
    remote table are always visible to sqlplus, both when the link
    has a username and when it doesn't have a username.
    All the databases involved are on the same version of Oracle.
    I'm not sure which Oracle login is being used to access the
    remote database, if my lightweight user has a database schema
    of 'xyz' then does portal use 'xyz' to access the remote
    database? I would be very grateful for any help or pointers that
    might help to solve this problem.
    James
    To further clarify this, both my local and remote databases
    schemas are owned by the same login.
    The remote table has a public synonym.
    The link is public but uses default rather than explicit logins.
    The local table has a public synonym that points to the remote
    synonym via the database link.
    If I change the link to have an explicit login then everything
    works correctly.
    I can view the data in the remote database with TOAD and with
    sqlplus even when the database link has default login rather
    than explicit login.
    This seems to point to Portal as being the culprit. Can anyone
    tell me whether default logins can be used across database links
    with portal?
    TIA
    James
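    For reference, the two flavors of link being compared look roughly like this (a sketch; the link names, credentials, and TNS alias are placeholders). ORA-28150 is "proxy not authorized to connect as client", which suggests Portal's proxy authentication does not carry over a connected-user link:

    -- Connected-user (default login) link: fails under Portal with ORA-28150
    CREATE PUBLIC DATABASE LINK remote_default
      USING 'REMOTE_TNS';

    -- Explicit-login link: works for the poster
    CREATE PUBLIC DATABASE LINK remote_explicit
      CONNECT TO xyz IDENTIFIED BY xyz_password
      USING 'REMOTE_TNS';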

    832019 wrote:
    One way to do this is by creating a database link and joining the two tables directly. But this option is ruled out. So please suggest me some way of doing this.
    Thus you MUST use two connection strings.
    So you are either going to be constructing some intricate SQL dynamically, or you are going to be dragging a lot of data over the wire and doing an in-memory search.
    Although realistically that is what the database-link join would have done as well.
    It might be better to look at moving the table data from one database to the other. That depends on size, of course.

  • Move XMLTYPE data to a remote  database

    Hello,
    I need your experience!
    I am working with a 10.2.1 database.
    I tried to call a remote procedure (over a DB link) with an XMLTYPE parameter, and I get an error related to a LOB locator.
    One solution could be to save the XMLTYPE to disk.
    Does somebody know another solution for transferring XMLTYPE data to a remote database?
    Thanks in advance
    Best Regards
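    One hedged workaround, given that LOB locators cannot cross a database link: serialize the XMLTYPE to a string locally and rebuild it on the remote side (remote_proc@remote_db is a placeholder; documents larger than 32K would need chunking or a staging table):

    DECLARE
      v_xml  XMLTYPE := XMLTYPE('<doc><item>1</item></doc>');
      v_text VARCHAR2(32767);
    BEGIN
      v_text := v_xml.getStringVal();  -- serialize locally
      remote_proc@remote_db(v_text);   -- remote side rebuilds with XMLTYPE(p_text)
    END;
    /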

    A future alternative will be Oracle Streams. By the way, one of the advantages (sometimes not, but anyway) of an Oracle database is that you can use object-oriented methods, relational methods, Java solutions, etc., and nowadays XML methods. XML was once all about transporting data and creating a uniform interface between solutions (databases, application/web servers, technology stacks, etc.). Think outside the box: XML DB isn't relational, but it is packaged in one. Make use of it, and vice versa.

  • APEX Application accessing data from two different databases

    Hi All,
    As we all know, an APEX application resides in a database and is connected to a schema of that database.
    I want my APEX application to access data from two different databases. To elaborate:
    Currently, my APEX production application is connected to the XXXX schema of the DB1 database (where APEX resides). Now I want to add some pages to this application for reporting purposes, but I want these report pages to get their data from a different schema, YYYY, in database DB2.
    Is it possible to configure this scenario?
    The reason for doing this is to avoid report-related (ad hoc query) resource utilization affecting the production DB1 database.
    Thanks
    Nil

    1. If you do the join of two or more tables in DB1, all the data is pulled over to DB1 before the join is executed: more data over the database link and more work for DB1. Better to keep the join where the data resides and pull over exactly the data you need.
    2. I don't know about your different block sizes. That seems like a good question for one of the other forums (DBA or SQL).
    3. I mean: create synonyms on DB1 for the report views in DB2, as sketched below.
    Hope all is clear!
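    Point 3 in practice might look like this (a sketch; the link, schema, password, and view names are all placeholders):

    -- On DB1: a link to DB2 and a local synonym for the remote report view
    CREATE DATABASE LINK db2_link
      CONNECT TO yyyy IDENTIFIED BY yyyy_password
      USING 'DB2_TNS';

    CREATE SYNONYM rpt_sales_v FOR yyyy.rpt_sales_v@db2_link;

    -- APEX report pages on DB1 can then query it as if it were local:
    SELECT * FROM rpt_sales_v;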

  • Data Mismatching in DSO

    Hi Folks,
    We have a data mismatch issue for one InfoObject (Base UOM) when data goes from the staging DSO to the conformance DSO. I will explain the issue here.
    For one invoice number, the Base UOM value is displayed as M2 in the staging DSO. When the data goes from the staging DSO to the conformance DSO, the Base UOM is displayed as a blank value. No code is written for calculating the Base UOM at the transformation level; it is a direct mapping to the Base UOM field in the conformance DSO from the staging DSO. For all other invoices, the Base UOM value is displayed correctly in the conformance DSO compared with the staging DSO. Only for this one specific invoice does the Base UOM show as blank in the conformance DSO.
    Could you please give me your suggestions on why the Base UOM value shows as blank in the conformance DSO?
    Thanks in advance.
    Regards,
    Nag

    Hi,
    You will have to check the following things:
    1) Check whether other records with base unit value M2 are updated properly in the conformance DSO. This will confirm whether there is an issue with this particular base unit.
    2) As you have mentioned that other records are updated successfully, you might have to debug the data load for this particular record. Do a selective data load and check where the unit is getting cleared.
    You can also check whether there is a conversion routine at the InfoObject level.
    Regards,
    Durgesh.

  • Upload data from excel into database through pl/sql

    Hi All,
    I have an Excel file that contains data, let's say employee details.
    I have an upload button which is used to upload the Excel file. I then want to map the cells of the Excel file to database columns and upload the Excel data into the database through PL/SQL code.
    In short, I want to upload the data from Excel into the database using PL/SQL code.
    Alternatively, suggest any other way to do this (except the data load method present in APEX).
    Thanks,
    Jitendra

    If you use APEX 4, you can define your own table load; the code below is for APEX 3.
    PROCEDURE pro_carga_planilla_prosp( p_archivo VARCHAR2) IS
    v_blob_data BLOB;
    v_blob_len NUMBER;
    v_position NUMBER;
    v_raw_chunk RAW(10000);
    v_char CHAR(1);
    c_chunk_len number := 1;
    v_line VARCHAR2 (32767) := NULL;
    v_data_array wwv_flow_global.vc_arr2;
    v_rows number;
    v_sr_no number := 1;
    v_ok boolean := true;
    v_local_ok BOOLEAN := TRUE;
    v_reg_ok NUMBER := 0;
    v_reg_ko NUMBER := 0;
    v_localidad_id NUMBER;
    v_departamento_id NUMBER;
    v_cargo_id NUMBER;
    v_prospecto_id NUMBER;
    v_asesor_id NUMBER;
    V_REG prospectos%rowtype;
    BEGIN
    -- Read data from wwv_flow_files
    select blob_content into v_blob_data
    from wwv_flow_files
    where name= p_archivo;
    v_blob_len := dbms_lob.getlength(v_blob_data);
    v_position := 1;
    -- Read and convert binary to char
    WHILE ( v_position <= v_blob_len ) LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_line := v_line || v_char;
    -- pro_log('linea '||v_line);
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    -- Convert comma to : to use wwv_flow_utilities
    v_line := replace(REPLACE (v_line, ',', ':'), ';',':');
    v_line := replace(replace(v_line, chr(10)),chr(13));
    if substr(v_line,1,1)= ':' then
    v_line := '0'||v_line;
    end if;
    if instr(v_line,':',1,21) = 0 then
    if instr(v_line,':',1,20) = 0 then
    v_line:=v_line||':';
    end if;
    v_line:=v_line||':';
    end if;
    -- pro_log(v_line);
    -- Convert each column separated by : into array of data
    v_data_array := wwv_flow_utilities.string_to_table (v_line);
    -- Insert data into target table
    IF v_data_array(1) IS NOT NULL AND
    v_sr_no <> 1 THEN
    V_REG.NOMBRE:=ltrim(rtrim(v_data_array(2)));
    V_REG.RAZON_SOCIAL:=v_data_array(3);
    V_REG.DIRECCION := v_data_array(4)||' '||v_data_array(5);
    -- PRO_LOG('PROSP 1 ' ||v_sr_no);
    v_localidad_id := pack_empresas.get_localidad(v_data_array(6));
    -- PRO_LOG('PROSP 1.1 '||v_sr_no);
    V_REG.LOCALIDAD_ID:=v_localidad_id;
    -- PRO_LOG('PROSP 1.2 '||v_sr_no);
    V_REG.CODIGO_POSTAL:=LTRIM(RTRIM(v_data_array(7)) );
    -- PRO_LOG('PROSP 1.3 '||v_sr_no);
    -- PRO_LOG('PROSP 1.1 '||v_sr_no);
    v_departamento_id := pack_empresas.get_departamento(v_data_array(8));
    -- PRO_LOG('PROSP 1.4 '||v_sr_no);
    V_REG.DEPARTAMENTO_ID:=v_departamento_id;
    -- PRO_LOG('PROSP 1.5 '||v_sr_no);
    V_REG.TELEFONO:=v_data_array(9);
    --PRO_LOG('PROSP 1.6 '||v_sr_no);
    V_REG.TELEFONO2:=v_data_array(10);
    -- PRO_LOG('PROSP 1.7 '||v_sr_no);
    V_REG.RUBRO:=v_data_array(11);
    -- PRO_LOG('PROSP 1.8 '||v_sr_no);
    V_REG.RUC:=ltrim(rtrim(v_data_array(12)));
    -- PRO_LOG('PROSP 1.9 '||v_sr_no);
    -- pro_log(v_data_array(1));
    -- pro_log(v_data_array(2));
    V_REG.CANTIDAD_EMPLEADOS:=RTRIM(LTRIM(v_data_array(13)));
    -- PRO_LOG('PROSP 1.10 '||v_sr_no);
    -- pro_log(v_data_array(14));
    V_REG.CANTIDAD_BENEFICIARIOS:=RTRIM(LTRIM(v_data_array(14)));
    --PRO_LOG('PROSP 1.11 '||v_sr_no);
    V_REG.MAIL:=v_data_array(19);
    -- pro_log(V_REG.MAIL);
    -- PRO_LOG('PROSP 1.12 '||v_sr_no);
    -- v_data_array(20):= replace(replace(v_data_array(20),chr(10)),chr(13));
    -- Only read column 20 if the row actually has it; otherwise default it to NULL
    if v_data_array.exists(20) then
    -- pro_log('existe');
    -- pro_log(ltrim(rtrim(replace(replace(v_data_array(20),chr(10)),chr(13)))));
    V_REG.Proveedor:= ltrim(rtrim(replace(replace(v_data_array(20),chr(10)),chr(13))));
    else
    v_data_array(20):=null;
    end if;
    -- V_REG.PROVEEDOR:=v_data_array(20);
    -- PRO_LOG('PROSP 1.13 '||v_sr_no);
    -- Only read column 21 if the row actually has it; otherwise default it to NULL
    if v_data_array.exists(21) then
    V_REG.OBSERVACIONES:=v_data_array(21);
    else
    v_data_array(21):=null;
    end if;
    -- PRO_LOG('PROSP 1.14 '||v_sr_no);
    -- PRO_LOG('PROSP 1.2 '||v_sr_no);
    insert into prospectos (nombre,razon_social, direccion,localidad_id,codigo_postal,
    departamento_id, telefono, telefono2, rubro,ruc,cantidad_empleados,
    cantidad_beneficiarios,mail,proveedor,observaciones)
    values (nvl(ltrim(rtrim(v_data_array(2))),v_data_array(3)), v_data_array(3),
    v_data_array(4)||' '||v_data_array(5),
    v_localidad_id, LTRIM(RTRIM(v_data_array(7))),v_departamento_id, v_data_array(9),
    v_data_array(10),v_data_array(11), ltrim(rtrim(v_data_array(12))), RTRIM(LTRIM(v_data_array(13))),
    RTRIM(LTRIM(v_data_array(14))),v_data_array(19),v_data_array(20), v_data_array(21))
    returning prospecto_id INTO v_prospecto_id;
    -- PRO_LOG('PROSP 2');
    v_cargo_id := pack_empresas.get_cargo(v_data_array(17));
    -- PRO_LOG('PROSP 3');
    insert into prospecto_contactos (prospecto_id,nombre,apellido,cargo_id,
    telefono,mail)
    values (v_prospecto_id, nvl(v_data_array(15),'S/N'), nvl(v_data_array(16),'S/A'),
    v_cargo_id, v_data_array(18), v_data_array(19));
    -- PRO_LOG('PROSP 4');
    v_asesor_id := pack_empresas.get_asesor(v_data_array(1));
    -- PRO_LOG('PROSP 5');
    insert into asignaciones (prospecto_id,asesor_id,fecha_asignacion)
    values (v_prospecto_id, v_asesor_id, trunc(sysdate));
    -- PRO_LOG('PROSP 6');
    END IF;
    -- Clear out
    v_line := NULL;
    v_sr_no := v_sr_no + 1;
    END IF;
    END LOOP;
    delete wwv_flow_files
    where name= p_archivo;
    END pro_carga_planilla_prosp;
    function hex_to_decimal
    --this function is based on one by Connor McDonald
    --http://www.jlcomp.demon.co.uk/faq/base_convert.html
    ( p_hex_str in varchar2 ) return number
    is
    v_dec number;
    v_hex varchar2(16) := '0123456789ABCDEF';
    begin
    v_dec := 0;
    for indx in 1 .. length(p_hex_str)
    loop
    v_dec := v_dec * 16 + instr(v_hex,upper(substr(p_hex_str,indx,1)))-1;
    end loop;
    return v_dec;
    end hex_to_decimal;
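    For completeness, a hedged usage sketch: after an APEX file-browse item has uploaded the spreadsheet (saved as CSV) into wwv_flow_files, the procedure is invoked with the file name (:P10_FILE is an invented item name):

    BEGIN
      pro_carga_planilla_prosp(p_archivo => :P10_FILE);
    END;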

  • Delta load from ODS to cube failed - Data mismatch

    Hi all,
    We have a scenario where the data flow is:
    R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
    The cube has an additional field called "monthly version", and since it is a history cube, it is supposed to hold snapshots of all the data in the current cube for each month.
    We are facing the problem that the data for the current month is in the history ODS but not in the cube. In the ODS -> Manage -> Requests tab I can see only 1 red request, and that one with 0 records.
    However, in the Cube -> Manage -> Reconstruction tab, I can see 2 red requests with the current month's date. Could these red requests be the reason for the data mismatch between the ODS and the cube?
    Please guide me on how I can solve this problem.
    Thanks all
    annie

    Hi
    Thanks for the reply.
    The load to the cube is a delta and runs daily.
    The load to the ODS is a full load on a daily basis.
    Can you help me sort out this issue? I have to work directly in the production environment, so it has to be safe and foolproof.
    Thanks
    annie
