Loading data into different schemas in different environments

Hi all,
My source technology is File and my target is Oracle.
In the Development environment I developed the code and successfully loaded the target Oracle DB with schema A.
In the UAT environment I have to load a different schema, schema B, in the same Oracle DB.
I imported the code from Development to UAT and am trying to load the data into schema B.
I created the physical connection and pointed it to the same logical name that I used for schema A.
I'm getting an error.
Could anyone please help with this, and correct me if I'm doing something wrong?
Thanks in advance 

Hi,
Can you please paste the error here?
Thanks

Similar Messages

  • Stage a tab-delimited CSV file and load the data into a different table

    Hi,
    I'm pretty new to writing PL/SQL packages.
    We are using Application Express for our development. We get CSV files which are stored as BLOB content in a table. I need to write a trigger that executes once the user uploads the file, parses through the BLOB content, and uploads or stages the data in a different table.
    I would like to see a tutorial or article that explains the above process with an example, or sample code to do the same. Any help in this regard will be highly appreciated.

    Hi,
    This is slightly unusual but at the same time easy to solve. You can read through a BLOB using the dbms_lob package, which is one of the Oracle-supplied packages. This is presumably the bit you are missing, as once you know how to read a LOB, the rest is programming 101.
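    For example, a minimal sketch of reading a CSV BLOB line by line with dbms_lob (the table and column names file_uploads/file_content are hypothetical, and it assumes single-byte character data):
    declare
      l_blob blob;
      l_len  integer;
      l_pos  integer := 1;
      l_nl   integer;
      l_line varchar2(32767);
    begin
      select file_content into l_blob from file_uploads where id = 1;
      l_len := dbms_lob.getlength(l_blob);
      while l_pos <= l_len loop
        -- find the next linefeed (hex 0A); 0 means we are on the last line
        l_nl := dbms_lob.instr(l_blob, hextoraw('0A'), l_pos, 1);
        if l_nl = 0 then
          l_nl := l_len + 1;
        end if;
        if l_nl > l_pos then
          -- turn the raw bytes of this line into a varchar2
          l_line := utl_raw.cast_to_varchar2(dbms_lob.substr(l_blob, l_nl - l_pos, l_pos));
          -- parse l_line (split on the delimiter) and insert into your staging table here
        end if;
        l_pos := l_nl + 1;
      end loop;
    end;
    /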
    Alternatively, you could write the LOB out to a file on the server using another built-in package called utl_file. This file can then be parsed using an appropriately defined external table. External tables are the easiest way of reading data from flat files, including CSV.
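    For instance, a minimal external table sketch for a comma-delimited file (the directory object, table name and columns are hypothetical):
    create table ext_csv (
      col1 varchar2(100),
      col2 varchar2(100),
      col3 number
    )
    organization external (
      type oracle_loader
      default directory data_dir    -- a directory object pointing at the file's location
      access parameters (
        records delimited by newline
        fields terminated by ','
        optionally enclosed by '"'
        missing field values are null
      )
      location ('sample.csv')
    )
    reject limit unlimited;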
    I say unusual because: why are you loading a CSV file into a BLOB? A CLOB is almost understandable, but if you can load into a column in a table, why not skip this bit and just load the data straight into the right table as it comes in?
    All of what I have described is documented functionality, assuming you are on 9i or greater. But you didn't provide a version, so I can't provide a link to the documentation ;)
    HTH
    Chris

  • Large Schema file registered and using folders to load the data (WebDAV)

    I have a very large schema file that I have registered using binary registration (the schema file is > 30000 lines).
    My data has been in MarkLogic, where it is organized in a folder structure providing reuse of pieces of the XML.
    It seemed the most logical way to load the data was to create the folder structure in XDB and move the files via WebDAV. We also have extensive XQueries written.
    My question is: how do I query the data back out when it is loaded this way? I have read about and experimented with resource_view, but that is not going to get me what I need. I would like to make use of the XQueries that I have.
    Can I do this, and if so, how?

    Sure. Let me lay this out with some more specific information.
    My schema has an overall root level of "Machine Makeup".
    All of these items are defined under it, with a lot of element reuse and tons of attribute groups used throughout the XML schema. I can do a lot of things, but what I cannot do is change the structure of the schema.
    The data is currently in a "folder structure" that closely resembles the following. I have tried to annotate the number of files; keep in mind all of these are working documents and additional "records" (XML files) can be added.
    Composite contains 12 folders (compfolder 1 through compfolder 12), and each of these folders contains XML documents.
    Most of these folders contain < 200 .xml files (each named id.xml); however, one of these directories currently contains 1546 files. They all belong... no real way to split them up further.
    At the same level as Composite, about half of the folders contain > 1000 but < 3000 files. For example:
    PartsUse: > 1000 but < 3000
    transfer of parts: 8000, and can continue to grow
    people: > 3000; these would be used in the transfer of parts, like "who sold it". Every time someone new is involved, a new one is added, etc.
    There are about 12 folders at this level.
    Now, the way the system works: when a new person who is not in our list is involved, the users get a "new" empty XML document to fill in, and we assign it an id and place it in the folder.
    So it would look something like this:
    Composite
      folder1 - 1000 xml files
      folder2 - 200 xml files
      ...
      folder12 - < 2000 xml files
    Locations - < 2000 xml files
    Activities - < 3000 xml files
    PartsTransfer - 8000 xml files and growing
    materials - < 1000 xml files
    setup - < 1000 xml files
    people - 3000 xml files and growing
    groupUse - < 1000 xml files
    and so forth.
    All of the folders contain the links I had previously laid out, so Activities would have materialLink id="333", peopleLink="666", and so on and so forth.
    So, because my file numbers are greater than the optimum about half the time, how would I set up 3 separate XMLType tables (I understand mine would be more than 3...) from the one schema file that is intertwined with the others? Can you show me an example? Would I annotate the schema at just those levels? This schema file is so huge that I would not want to have to annotate every element with a table. Can I pick and choose, to mimic the folder structure?
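    To make the question concrete: is this the kind of thing I would end up with, assuming I annotated the global Composite element with xdb:defaultTable="COMPOSITE_TAB" at registration time? (A sketch only; the table name and annotation are my guesses.)
    -- hypothetical: Composite documents would land in COMPOSITE_TAB,
    -- and I could run my existing XQueries against that table
    select xmlserialize(document t.object_value as clob)
      from composite_tab t
     where xmlexists('/Composite[@id="333"]' passing t.object_value);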
    THANKS SO MUCH for your help. I really want to show that Oracle is the way. I can implement it with a simpler structure, but not at the size and complexity that I need. I haven't been able to find many examples with the new binary registration.
    Edited by: ferrarapayne on Nov 16, 2011 1:00 PM

  • Unable to load the data into HFM

    Hello,
    We created a new HFM app and configured it with FDM, generated an output file through FDM, and loaded that file through HFM directly 5-6 times; there was no issue up to this point.
    Then I loaded the file through FDM 4 times successfully, even for different months. But after 4 loads I started getting an error. Attached is the error log.
    Please help us at the earliest.
    ** Begin fdmFM11XG6A Runtime Error Log Entry [2013-10-30-13:44:26] **
    Error:
    Code............-2147217873
    Description.....System.Runtime.InteropServices.COMException (0x80040E2F): Exception from HRESULT: 0x80040E2F
       at HSVCDATALOADLib.HsvcDataLoadClass.Load(String bstrClientFilename, String bstrClientLogFileName)
       at fdmFM11XG6A.clsFMAdapter.fDBLoad(String strLoadFile, String strErrFile, String& strDelimiter, Int16& intMethod, Boolean& blnAccumFile, Boolean& blnHasShare, Int16& intMode)
    Procedure.......clsHPDataManipulation.fDBLoad
    Component.......E:\Opt\Shared\Apps\Hyperion\Install\Oracle\Middleware\EPMSystem11R1\products\FinancialDataQuality\SharedComponents\FM11X-G6-A_1016\AdapterComponents\fdmFM11XG6A\fdmFM11XG6A.dll
    Version.........1116
    Identification:
    User............fdmadmin
    Computer Name...EMSHALGADHYFD02
    FINANCIAL MANAGEMENT Connection:
    App Name........
    Cluster Name....
    Domain............
    Connect Status.... Connection Open
    Thanks,
    Raam

    We are working with the DB team, but they have confirmed that there is no issue with the DB. The process we have followed:
    As a standard process, while loading the data from FDM or manually to HFM, we don't write any SQL query. Using the web interface, data is loaded into the HFM application. This data can be viewed by different reporting tools (Smart View (Excel)/HFR reports/etc.).
    There are no official documents on the Oracle website which talk about an INSERT SQL query used to insert data into HFM tables. Hyperion does not provide much detail on the internal tables it uses, nor much insight into the internal structure of the HFM system.
    As per Hyperion blogs/forums on the internet, HFM stores the base-level data in so-called DCE tables (for example EMHFMFinal_DCE_1_2013, where EMHFMFinal is the application name, 1 identifies the Scenario and 2013 the Year). Each row in a DCE table contains data for all periods of a given combination of dimensions (also called an intersection).
    We are trying to load the same data file with the Replace option (it should delete the existing data before loading the data file).

  • How to load the data from flat file ( ex excel ) to Planning area directly

    Hi all ,
    How can I load the data from a flat file directly to the planning area?
    Please help me with this.
    Regards,
    Chandu .

    "I downloaded one key figure's data from the planning book (interactive demand plan), made some changes, and need to upload the data back to the same planning book."
    May I know why you are thinking of downloading, changing and uploading just to change the figures for a particular key figure? You can do it in the planning book itself.
    However, not all key figures can be changed. What type of key figure are we speaking of here? Is it like 'Forecast', whose value is based on other key figures, or is it a key figure where some manual adjustments are to be done, so that it can be manually edited? In both cases, the data can be changed in the planning book only. In the first case, you change the values of the dependent key figures; in the second case, you change the key figure directly.
    And please note that you can change the values of the key figures only at the detailed level. So, after loading the data in the book, use the drill-down option, maintain the data at the detailed level, and change the figures; this automatically gets reflected at the higher level.
    In case you are unable to change the values, go to the 'Design' mode of the book, right-click your key figure, and under "Selected Rows" uncheck the "Output Only" option. If you cannot see that option, you are not authorised to change it; see if you can change the authorisations by going to the "Data View" tab in the planning book configuration (/n/sapapo/sdp8b) and changing the value of Status to 3.
    Hope your query is answered with the different solutions offered by the many SDN colleagues here.
    Regards,
    Guru Charan.

  • TSV_TNEW_PAGE_ALLOC_FAILED error while loading the DATA using DTP

    Hi,
    While loading the data using DTP for 2 DSOs, we are getting the error
    TSV_TNEW_PAGE_ALLOC_FAILED
    Can anyone kindly help me out regarding the same?
    Thank You,
    Poornima.

    Hi Soundarya,
    Thanks a lot for the reply. But I found that it runs fine in development, whereas in quality it throws an error. This happened for two DSOs. In both transformations I have identified that the transformation names are different between Development and Quality.
    No routines are written for them, and no select statements have been used.
    Can you please suggest me regarding the same.
    Edited by: Poornima Gayatri on Mar 22, 2010 7:00 AM

  • If the data is not available in R/3 systems(Ex: MM), who will load the data

    Hi All,
    Can anybody tell me: if the data is not available in the R/3 systems (e.g. MM), who will load the data into them?
    Need Helpful Answers....
    Regards,
    Kiran Telkar

    Hi kiran,
    The data is generated in R/3 when business transactions take place.
    No one loads the data into R/3 as is done with BW. R/3 is an OLTP (online transaction processing) system which aids the day-to-day transactions of a business, and these transactions are stored in R/3 in their respective tables in the form of records, so one record is generated for each transaction done.
    For example, a record will be generated in different modules when an order is placed or when material is delivered against an order.
    Hope this helps,
    regards.

  • Unable to load the data from PSA to INFOCUBE

    Hi BI Experts, good afternoon.
    I am loading 3 years of data (full load) from R/3 to an InfoCube.
    I loaded the data month-wise, so I created 36 InfoPackages.
    Everything is fine, but I got an error for Jan 2005 and Mar 2005. It is the same error in both months: Caller 01 and Caller 02 errors (meaning there are invalid characteristics in the PSA data).
    So I deleted both the PSA and data target requests, and again loaded the data, only to PSA.
    Here I got the data into PSA without failure.
    Then I tried to load the data from PSA to the InfoCube manually, but it's not happening.
    One message came up:
    SID 60,758 is smaller than the compress SID of cube ZIC_C03; no request booking.
    Please give me the solution for how to solve this problem.
      Thanks & Regards
         Anjali

    Hi Teja,
    Thanks for the good response.
    How can I check whether it is already compressed or not?
    Please give me the reply.
      Thanks
              Anjali

  • Unable to load the data into Cube Using DTP in the quality system

    Hi,
    I am unable to load the data from PSA to the cube using a DTP in the quality system for the first time.
    I am getting errors like "Data package processing terminated" and "Source TRCS 2LIS_17_NOTIF is not allowed".
    Please suggest.
    Thanks,
    Satyaprasad

    Hi,
    Some InfoObjects were missing while collecting the transport.
    I collected those objects and transported them; now it's working fine.
    Many Thanks to all
    Regards,
    Satyaprasad

  • Regarding loading the data from DSO to cube

    Hello Experts,
    I have a DSO which loads data from PSA using a 7.0 transformation (using a DTP), and I have a cube which loads the data from that DSO using 3.x transfer rules. Now, I deleted the init request for an InfoPackage before loading the data into the DSO. But when I load the data from the DSO to the cube by right-clicking on the DSO -> Additional Functions -> Update the 3.x data to targets, it gives me an error like 'Delete init. request REQU_4H7UY4ZXAO72WR4GTUIW0XUKP before running init. again with same selection'.
    Please help me with this.
    I want to load the data in the init request to the cube.
    Thanks

    Hi Shanthi,
    Thanks for the reply. I have already deleted the init request from the source system to the DSO and then tried again; I am still getting the error.
    Thanks

  • SQL Interface - Error in Loading the data from SQL data source

    Hello,
    We have been using a SQL data source for loading the dimensions and the data for many years; even on Essbase 11.1.1.0 it has been quite a while (more than one year). For the past few days, we have been getting the below error when trying to load the data.
    [Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Info(1021013)
    ODBC Layer Error: [S1000] ==> [[DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]CURSOR IDENTIFIED IN FETCH OR CLOSE STATEMENT
    IS NOT OPEN (DIAG INFO: ).]
    [Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Info(1021014)
    ODBC Layer Error: Native Error code [4294966795]
    [Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Error(1021001)
    Failed to Establish Connection With SQL Database Server. See log for more information
    [Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Error(1003050)
    Data Load Transaction Aborted With Error [7]
    [Mon Jan 10 11:02:56 2011]Local/{App Name}///Info(1013214)
    Clear Active on User [Olapadm] Instance [1]
    Interestingly, after the job fails through our batch scheduler environment, when I run the same script that is used by the batch scheduler, the job completes successfully.
    Also, this is the first time I have seen this kind of error message.
    I'd appreciate any help or suggestions to find a resolution. Thanks,

    Hi Priya,
    The reasons may be: the file is open; the format/flat-file structure is not correct; the mapping/transfer structure may not be correct; presence of invalid characters or data inconsistency in the file; etc.
    Check if the flat file is in .CSV format.
    You have to save it in .CSV format for the flat-file loading to work.
    Also check for connection issues between the source system and BW; sometimes it may be due to inactive update rules.
    Refer: error 1
    Find out the actual reason and let us know.
    Hope this helps.
    Regards,
    Raghu.

  • Excel 2013 PowerPivot Error - "PowerPivot is unable to load the Data Model"

    I'm attempting to use the PowerPivot add-on in Excel. When clicking 'Manage Data Model' or attempting to 'Load to data model', I receive the error 'We couldn't load the Data Model. This may be because the Data Model in this workbook is damaged.',
    followed by 'PowerPivot is unable to load the Data Model.' This happens irrespective of data source type. I have Excel 2013 32-bit and PowerPivot 32-bit, running on 64-bit Windows 7. I am not running an SSAS instance on my machine.
    Any suggestions?

    Hi
    Have you ever tried repairing your Office installation?
    We may also refer to the following blog:
    http://blogs.technet.com/b/the_microsoft_excel_support_team_blog/archive/2013/11/12/powerpivot-for-excel-2013-errors-after-october-update-kb-2825655.aspx
    This issue seems to be caused by the October 2013 update (KB 2825655) for Excel 2013. Try removing it, or restore the system to an earlier point, and check the result.
    Regards
    Tylor Wang
    TechNet Community Support

  • How to write a procedure to load the data into a table, using an XML file as input to the procedure?

    Hi,
    I am new to XML.
    Can anyone please help me write a procedure to load the data into a table, using XML as the input parameter to the procedure? The XML file, which is my input, is shown below.
    <?xml version="1.0"?>
    <DiseaseCodes>
    <Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    </DiseaseCodes>
    Regards,
    vikram.

    Here is your XML parsed in 11g:
    select *
      from xmltable('//Entity' passing xmltype(
    '<?xml version="1.0"?>
    <DiseaseCodes>
    <Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    </DiseaseCodes>
    ') columns
      "dcode" varchar2(4000) path '/Entity/dcode',
      "ddesc" varchar2(4000) path '/Entity/ddesc',
      "reauthflag" varchar2(4000) path '/Entity/reauthflag');
    dcode  ddesc                                                    reauthflag
    -----  -------------------------------------------------------  ----------
    0      (I87)Other disorders of veins - postphlebitic syndrome   0
    0      (J04)Acute laryngitis and tracheitis                     0
    0      (J17*)Pneumonia in other diseases - whooping cough       0
    Using this parser, you can create the procedure as:
    create or replace procedure myXMLParse(x clob) as
    begin
      insert into MyXmlTable
        select *
          from xmltable('//Entity' passing xmltype(x) columns
                        "dcode" varchar2(4000) path '/Entity/dcode',
                        "ddesc" varchar2(4000) path '/Entity/ddesc',
                        "reauthflag" varchar2(4000) path '/Entity/reauthflag');
      commit;
    end;
    /
    Procedure created
    SQL> exec myXMLParse('<?xml version="1.0"?><DiseaseCodes><Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity><Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity><Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity></DiseaseCodes>');
    PL/SQL procedure successfully completed
    SQL> select * from MYXMLTABLE;
    dcode  ddesc                                                    reauthflag
    -----  -------------------------------------------------------  ----------
    0      (I87)Other disorders of veins - postphlebitic syndrome   0
    0      (J04)Acute laryngitis and tracheitis                     0
    0      (J17*)Pneumonia in other diseases - whooping cough       0
    Ramin Hashimzade

  • Loading the data from a packed decimal format file using a sql*loader.

    Hi ,
    In one of the projects I'm working on here, I have to load the data into an Oracle table from a file using SQL*Loader, but the problem is that the data file is in packed-decimal format. Please let me know if there is any way to do this... I have searched a lot regarding this. If anybody has faced this type of problem, please let me know the steps to solve it.
    Thanks in advance,
    Narasingarao.

    declare
      f  utl_file.file_type;
      s1 varchar2(200);
      s2 varchar2(200);
      s3 varchar2(200);
      c  number := 0;
    begin
      f := utl_file.fopen('TRY','sample1.txt','R');
      utl_file.get_line(f,s1);
      utl_file.get_line(f,s2);
      utl_file.get_line(f,s3);
      insert into sampletable (a,b,c) values (s1,s2,s3);
      c := c + 1;
      utl_file.fclose(f);
    exception
      when NO_DATA_FOUND then
        if utl_file.is_open(f) then
          utl_file.fclose(f);
        end if;
        dbms_output.put_line('No. of rows inserted : ' || c);
    end;
    /
    SY.
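    For the packed-decimal part of the original question: SQL*Loader can read packed-decimal fields directly via the DECIMAL datatype in the control file, so no pre-conversion is needed. A minimal control-file sketch, assuming fixed-position records (the table name, positions, precision and scale are hypothetical):
    LOAD DATA
    INFILE 'sample1.dat'
    INTO TABLE sampletable
    (
      a POSITION(1:10)  CHAR,
      b POSITION(11:15) DECIMAL(9,2),  -- packed decimal: 9 digits, scale 2, 5 bytes
      c POSITION(16:20) DECIMAL(9,2)
    )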

  • Load the data from a transparent customised table in BW to an ODS

    Hi there,
    How can you load the data from a transparent customised table created in BW to an ODS?
    Many thanks.
    Sarah

    Hi,
    Just create a generic DataSource in RSO2 for that table, replicate the DataSource for the source system 'myself', create an InfoSource and assign the new DataSource to that InfoSource. Then create an update rule from the InfoSource to the ODS, create an InfoPackage and load the data.
    regards
    Siggi
