Expert suggestion needed for loading data into the DB
Hi all
The current requirement we have is to load data from an Excel file into an Oracle database.
The Excel file has around 30 sheets, each corresponding to a table (i.e. 30 tables in the database).
Currently we are using sqlldr commands to load data from CSV files into the Oracle database. The only problem is that sometimes the DBA has to go out or is busy working on something else, so may I kindly get some expert suggestions on how to automate this data-loading task?
Somebody has suggested to my manager (who is not aware of Oracle or IT at all) that data can be loaded via ODBC. It was suggested to him that all we need to do is place the CSV files in a particular folder on the server and Oracle will do the rest.
I am not that proficient in data-loading methods, so may I know of any technique that will simplify/automate the task?
I mean, how can we automate things so that the SQL*Loader scripts run at the command prompt every time? I feel the database (Oracle) has nothing to do with the command prompt. Isn't that so?
Kindly share your expert/experienced suggestions.
I would be highly grateful to all
Regards
To automate sqlldr scripts, you would usually write an OS script that periodically looks for files in a given directory and runs the sqlldr commands:
- on Windows: .bat or .wsh files, scheduled with the "at" command or the Windows Scheduled Tasks tool
- on Unix: a shell script in the crontab.
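For the Unix case, one polling pass could look like the sketch below. Everything concrete here is an assumption to adapt: the directory names, the one-control-file-per-table naming convention, and the placeholder credentials.

```shell
#!/bin/sh
# One polling pass: load every CSV waiting in the inbox, then archive it.
# Assumes each CSV is named after its target table and has a matching
# control file, e.g. emp.csv -> /opt/etl/ctl/emp.ctl (all paths invented).
poll_and_load() {
  inbox=${INBOX:-/data/inbox}          # where the CSV files are dropped
  archive=${ARCHIVE:-/data/archive}    # loaded files end up here
  loader=${LOADER:-sqlldr}             # overridable, e.g. for a dry run
  conn=${CONN:-scott/tiger@orcl}       # placeholder credentials - replace!
  ctl_dir=${CTL_DIR:-/opt/etl/ctl}
  mkdir -p "$archive" || return 1
  for f in "$inbox"/*.csv; do
    [ -e "$f" ] || continue                      # nothing waiting to load
    table=$(basename "$f" .csv)
    if "$loader" userid="$conn" control="$ctl_dir/$table.ctl" \
         data="$f" log="$archive/$table.log"; then
      mv "$f" "$archive/"                        # success: archive the file
    else
      echo "load failed for $f, leaving it for retry" >&2
    fi
  done
}

# Run one pass when the inbox exists; cron would call this every few minutes,
# e.g.:  */10 * * * * /opt/etl/load_csv.sh
if [ -d "${INBOX:-/data/inbox}" ]; then
  poll_and_load
fi
```

With that in the crontab, nobody has to sit at the command prompt: the script does the sqlldr calls, and the log files in the archive folder show what happened.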
A much more complicated approach, but without any OS script:
- write PL/SQL code that reads and parses the file to be loaded, using the UTL_FILE package
- this PL/SQL code must also generate INSERT statements and process errors ...
- this PL/SQL code can be scheduled with the DBMS_JOB package to run at periodic intervals.
Message was edited by: Pierre Forstmann
Similar Messages
-
ABAP routine for loading data of current month in InfoPackage
Hello experts,
I want to load data for the current month. Therefore I implemented the following ABAP routine for field 0CALDAY in the InfoPackage's data selection:
data: lv_datum_l like sy-datum,
lv_datum_h like sy-datum.
concatenate sy-datum(6) '01' into lv_datum_l.
call function 'RP_LAST_DAY_OF_MONTHS'
exporting
day_in = sy-datum
importing
last_day_of_month = lv_datum_h.
l_t_range-sign = 'I'.
l_t_range-option = 'BT'.
l_t_range-low = lv_datum_l.
l_t_range-high = lv_datum_h.
But function 'RP_LAST_DAY_OF_MONTHS' cannot be found!
Any other suggestions for getting the last day of the current month?
regards
hiza
Hi,
Add this code. When you finish executing it, ZDATE will contain your month-end date.
data : zdate_c(2) type c,
       zyear_c(4) type c,
       zmonth_c(2) type c.
data : zdate like sy-datum.
data : zday(2) type n.
data : no_of_days type i.
zdate = sy-datum.
zyear_c = zdate(4).
zmonth_c = zdate+4(2).
zdate_c = '01'.
concatenate zyear_c zmonth_c zdate_c into zdate.
CALL FUNCTION 'FIMA_END_OF_MONTH_DETERMINE'
  EXPORTING
    I_DATE = zdate
  IMPORTING
    E_DAYS_OF_MONTH = no_of_days.
zday = no_of_days.
concatenate zyear_c zmonth_c zday into zdate. -
Hi,
We are having trouble importing one ledger, 'GERMANY EUR GGAAP'. It works for Dec 2014, but trying to import data for 2015 gives an error.
The import error shows "RuntimeError: No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'."
I tried all the knowledge documents from Oracle Support, but no luck. Please help us resolve this issue, as it is occurring in our production system.
I also checked all the period settings under Data Management > Setup > Integration Setup > Global Mapping and Source Mapping, and they all look correct.
Also, it is only happening to one ledger; all the other ledgers are working fine without any issues.
Thanks
Hi,
there are some support documents related to this issue.
I would suggest you have a look at them.
Regards -
SQL*Loader: need to load data with "," in only one field
Hi,
I need to load data into one column; my data is in CSV format, like this:
Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan
I want to upload this data into my table, which contains only one column, name.
What will be the query to upload this data through SQL*Loader?
Thanks
Shahzaib ismail
Oracle database Express Edition Developer 6I
Since you mention you're using database version XE, I'll assume your database version is at least 10.2
SQL> select * from v$version;
BANNER
Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
and so you have the power of:
- external tables
http://www.oracle-base.com/articles/9i/ExternalTables9i.php
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6611962171229
- regular expressions
http://nuijten.blogspot.com/2009/07/splitting-comma-delimited-string-regexp.html
and you don't want to be using SQL*Loader anymore, never ever.
I simply put your string 'Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan' in a file called test.csv and told Oracle that file is in my Oracle directory DATA_DIR (that actually points to: c:\data on my 'filesystem' ) and then:
SQL> create table t(name varchar2(155));
Table created.
SQL> -- instead of SQL*Loader use an External Table:
SQL> create table ext_t
2 ( textstring varchar2(4000)
3 )
4 organization external ( type oracle_loader
5 default directory DATA_DIR
6 access parameters (fields terminated by '' )
7 location ('test.csv')
8 );
Table created.
SQL> -- Now you can query your file as if it were a table!
SQL> select * from ext_t;
TEXTSTRING
Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan
1 row selected.
SQL> -- and use the powers of SQL to do whatever you want (instead of cludging with those dreaded ctl files):
SQL> select regexp_substr (textstring, '[^,]+', 1, rownum) names
2 from ext_t
3 connect by level <= length(regexp_replace(textstring, '[^,]+'))+1;
NAMES
Shahzaib ismail
Imran aziz
Shahmir mehmood
Shahzad khan
4 rows selected.
SQL> -- Voilà, the data is loaded into the table in one single SQL statement:
SQL> insert into t
2 select trim(names)
3 from ( select regexp_substr (textstring, '[^,]+', 1, rownum) names
4 from ext_t
5 connect by level <= length(regexp_replace(textstring, '[^,]+'))+1
6 );
4 rows created.
SQL> --
SQL> select * from t;
NAME
Shahzaib ismail
Imran aziz
Shahmir mehmood
Shahzad khan
4 rows selected.
Don't use SQL*Loader, use an External Table. -
Dear Friends ,
We have already maintained the flow from 0CO_OM_CCA_9 to InfoCube 0COOM_C02. Now I have enhanced two fields in the DataSource which I want to load into the cube. The InfoCube already contains data from 2008 onwards. I know that for controlling DataSources, SAP ECC/R3 downtime is not needed. The requirement is to load the historical data again, from 2008 onwards, for the two enhanced fields as well.
Kindly let me know the steps for doing this, as the delta mechanism is already activated for this flow.
Thanks & Regards ,
Ashutosh Singh
Hi All,
I am able to create this in client 100 but not in 120. I created this program by copying the source code from the SAP note and ran it in test mode in client 100 of the development server. I am getting the message that "there is no inconsistency in table BWMO2_TIMEST".
1. I am using client 120 for loading data, not 100.
Please let me know the further steps.
Thanks,
Harry -
Need to Load data in XML file into multiple Oracle tables
Experts
I want to load data that is stored in an XML file into multiple Oracle tables using PL/SQL.
I am a novice in this area. Kindly explain in depth, or step by step, so that I can achieve this.
Thanks in advance.
Hi,
extract your XML and then you can use the INSERT ALL clause.
here's a very small example on 10.2.0.1.0:
SQL> create table table1(id number,val varchar2(10));
Table created.
SQL> create table table2(id number,val varchar2(10));
Table created.
SQL> insert all
2 into table1 values(id,val)
3 into table2 values(id2,val2)
4 select extractValue(x.col,'/a/id1') id
5 ,extractValue(x.col,'/a/value') val
6 ,extractValue(x.col,'/a/value2') val2
7 ,extractValue(x.col,'/a/id2') id2
8 from (select xmltype('<a><id1>1</id1><value>a</value><id2>2</id2><value2>b</value2></a>') col from dual) x;
2 rows created.
SQL> select * from table1;
ID VAL
1 a
SQL> select * from table2;
ID VAL
2 b
Ants -
Process chains for loading data to target is not functioning
Hi SAPians,
Recently, we upgraded the firmware on our IBM P590 with IBM's help on Saturday, 06/12/2008 (the firmware of the P5-590 was SF235_209; it was upgraded to SF240_XXX), and since then the process chains for loading data to targets have not been functioning properly. We stopped all the SAP services, database services and OS services from our end, and the services were rebooted after the firmware upgrade.
However, the problem with the process chains that load transaction data and hierarchies is solved, but the chains that load master data are still not working as scheduled.
We tried to load the master data manually: when running a DTP to load data to an object, analysis code (attributes), the request remained YELLOW. After refreshing it several times, the request turned RED. The error message was PROCESSING TERMINATED (screenshot 1 attached).
To mitigate this we tried deleting the bad request, and it prompted with the message below:
"Cannot lock request DTPR_4C37TCZGDUDX1PXL38LVD9CLJ of table ZANLYSIS for deletion (Message no. RSMPC178)" (screenshot 2 attached)
Please advise us on how the bad request should be deleted.
Regards,
Soujanya
Hi Sowjanya,
Follow the procedure below to turn the yellow request RED:
Use SE37 to execute the function module RSBM_GUI_CHANGE_USTATE.
On the next screen, enter that request ID for I_REQUID and execute.
On the next screen, select the 'Status Erroneous' radio button and continue.
This function module changes the status of the request from green/yellow to RED.
Once it is RED, you can manually delete that request.
Releasing the lock:
Go to transaction code SM12 and release the lock.
Hope it helps you.
Regards,
Nagaraju.V -
Steps for loading data into the infocube in BI7, with dso in between
Dear All,
I am loading data into the InfoCube in BI7, with a DSO in between. My data flow looks like this:
Top to bottom:
InfoCube (Customized)
Transformation
DSO (Customized)
Transformation
DataSource (Customized).
The mapping and everything else look fine, and the data is also seen in the cube on the FULL load.
But due to some minor error (I guess), I am unable to see the DELTA data in the DSO, although it is loaded into the DataSource through process chains.
Kindly advise me where I went wrong.
Or, step-by-step instructions for loading data into the InfoCube in BI7, with a DSO in between, would be really helpful.
Regards,
Hi,
my first impulse would be to check whether the DSO is set to "direct update". In that case no delta is possible, because the change log is not maintained.
My second thought would be to check the DTP moving data between the DSO and the target cube. If this is set to full, you will not get a delta. It is only possible to create one DTP, so if you created one in FULL mode you can't switch to delta; create the DTP in Delta mode instead.
Hope this helps.
Kind regards,
Jürgen -
Need to load data from source .CSV files to oracle target database.
Hi,
This is my scenario:
I have .CSV files in an FTP folder and need to load the data into target tables.
For that I need to create a package and load the data on a daily basis.
But sometimes the .csv file name will vary from day to day.
Can anyone suggest something?
Thanks in advance.
Zakeer
Dear Roy,
Thanks for your response.
Now I am able to extract the .zip file with OdiUnZip (file) and load the data into the target, but this is happening in a static way.
My scenario is that sometimes I will get .zip files with different names, containing different .csv files.
I need to dynamically find the new .zip file, extract it, and load the data into the target.
Please advise me.
Thanks in advance
Zakeer -
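The dynamic part of the request above (find the newest .zip, whatever it is named, and extract it to a fixed place) can be sketched as a small OS helper that an ODI package would call through an OS command step. The folder names are invented; treat this as a sketch, not a finished package.

```shell
#!/bin/sh
# newest_zip DIR : print the most recently modified *.zip in DIR (if any).
newest_zip() {
  ls -1t "$1"/*.zip 2>/dev/null | head -n 1
}

# process_drop DROP WORK : extract the newest drop into the staging folder,
# so the rest of the package can keep reading CSVs from a fixed location.
process_drop() {
  drop=${1:-/ftp/drop}
  work=${2:-/ftp/staging}
  zipfile=$(newest_zip "$drop")
  if [ -z "$zipfile" ]; then
    echo "no zip file found in $drop" >&2
    return 1
  fi
  mkdir -p "$work"
  unzip -o "$zipfile" -d "$work"   # the CSV names inside may vary too
}
```

Extracting into one fixed staging folder means the interfaces downstream never need to know the zip's name; only this helper deals with the varying file names.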
Need help loading data from Excel data file to oracle tables
I need to load an Excel spreadsheet into an Oracle table.
The spreadsheet contains 20 columns; the first 8 columns contain basic info, but the remaining 12 columns contain info like
1stQtr_08, 2ndQtr_08, 3rdQtr_08, 4thQtr_08
1stQtr_09, 2ndQtr_09, 3rdQtr_09, 4thQtr_09
1stQtr_10, 2ndQtr_10, 3rdQtr_10, 4thQtr_10
So what I need to accomplish is:
break that one record (with 20 fields) in Excel into 3 records, one per fiscal year, in the Oracle table, so each record in the database table will look like
8 basic fields + fiscal_year + 1stQtr_08, 2ndQtr_08, 3rdQtr_08, 4thQtr_08
8 basic fields + fiscal_year + 1stQtr_09, 2ndQtr_09, 3rdQtr_09, 4thQtr_09
8 basic fields + fiscal_year + 1stQtr_10, 2ndQtr_10, 3rdQtr_10, 4thQtr_10
There are about 10000 rows in the data file, so how can I use sqlldr to perform such a task? Besides sqlldr, any other good suggestions?
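One option for the reshaping described above is to do it before the load: run the CSV export through a small awk filter that turns each 20-column row into three 13-column rows (8 basic fields, then the fiscal year, then that year's 4 quarters), and point sqlldr or an external table at the result. This is a sketch: it assumes plain CSV with no quoted commas, and the 08/09/10 column order shown above.

```shell
#!/bin/sh
# unpivot_quarters : read 20-column CSV rows on stdin; for each one, write
# 3 rows: the 8 basic fields, the fiscal year, then that year's 4 quarters.
unpivot_quarters() {
  awk -F, -v OFS=, '{
    base = $1
    for (i = 2; i <= 8; i++) base = base OFS $i   # the 8 basic fields
    for (y = 0; y < 3; y++) {                     # year blocks: 2008..2010
      row = base OFS (2008 + y)
      for (q = 1; q <= 4; q++)                    # cols 9-12, 13-16, 17-20
        row = row OFS $(8 + y * 4 + q)
      print row
    }
  }'
}
```

The output is plain CSV with a fixed 13-column layout, so the control file (or external table definition) stays a simple one-to-one mapping.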
thx
External Tables is just an Oracle API for SQL*Loader. If you are going to load this data over and over again, external tables would be a good idea, but if it's a one-time thing, sqlldr is simpler, unless you just want to learn how to use external tables.
I used to run a data warehouse, so I have done something at least similar to what you are doing in the past. -
Urgent Suggestions needed for best way to solve the problm
Hi Everybody,
I am working on an application which has to graphically show the data in the database. We are using JSP for the front end (the view for the time being will be simple, with text boxes, frames and DHTML) and Tomcat as the server. The data is huge, and there isn't much business logic involved (when the user clicks on a URL, I get the data pertaining to that particular table or column). Now my question is: can Tomcat handle such huge data? What do you suggest: should I cache data at my server and serve it to the clients (instead of connecting to the DB for each and every request)? If yes, can you please suggest a good way to cache data; I mean, what TableModel or HashMap would store data from different tables? Also, the DB isn't updated for a while every day. And let me know if any of you still think a 3-tier approach is advisable instead of the 2-tier approach.
Thanks to the guys who made it this far
Well, of course a 3-tier approach is advisable. Also, yes, Tomcat can handle what you are doing. As far as caching data on the server goes, that is going to depend on your database and other requirements. Now, about the graphical nature: you aren't going to be able to use JSP to draw graphics. The only way you can do it is to draw the graphics on the server side and write out JPG files, but then the JSP won't know anything about where the image is, unless you always write the same number of images, in the same place, every time. Otherwise, you need an applet. Hope that helps.
-
How to increase the number of processes used for loading data?
Hi,
While loading data from R/3 to BW, we found that this particular load takes 2 processes to load data into the cube.
Is there any way to increase the number to 4?
Thanks in advance.
Bobby.
Bobby,
to my knowledge we can't change that. Let me explain: there is a setting in the source system for the DataSource's default data transfer, where the processes are assigned. If you want to assign 4, you need to change the setting in the source system. For flat files we can change it in the BW system. We can maintain the setting at the InfoPackage level (whatever is assigned in the source system), but we can't change the number of processes.
To check the setting in the source system: SBIW --> General Settings --> Control Parameters for Data Transfer.
We need to change it there; this will affect all the DataSources. Before making changes, check with your Basis team.
All the best.
Regards,
Nagesh Ganisetti. -
SOS: Error loading data from Oracle 11g to Essbase using ODI
Hi all.
I want to load data from an Oracle database to Essbase using ODI.
I successfully configured the physical and logical Hyperion Essbase connections in Topology Manager, and got the Essbase structure of the BASIC app DEMO.
The problems are:
1. When I try to view data by right-clicking on the Essbase table, I get:
java.sql.SQLException: Driver must be specified
at com.sunopsis.sql.SnpsConnection.a(SnpsConnection.java)
at com.sunopsis.sql.SnpsConnection.testConnection(SnpsConnection.java)
at com.sunopsis.sql.SnpsConnection.testConnection(SnpsConnection.java)
at com.sunopsis.graphical.frame.b.jc.bE(jc.java)
at com.sunopsis.graphical.frame.bo.bA(bo.java)
at com.sunopsis.graphical.frame.b.ja.dl(ja.java)
at com.sunopsis.graphical.frame.b.ja.<init>(ja.java)
at com.sunopsis.graphical.frame.b.jc.<init>(jc.java)
I got an answer from an Oracle supporter: it's OK, just ignore it. Then the second problem appeared.
2. I created an interface between the Oracle database and Essbase, selected the option "staging area different from target" (meaning the staging area is created in the Oracle database), and used IKM SQL to Hyperion Essbase (metadata). Executing this interface gives:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 61, in ?
com.hyperion.odi.essbase.ODIEssbaseException: Invalid value specified [RULES_FILE] for Load option [null]
at com.hyperion.odi.essbase.ODIEssbaseMetaWriter.validateLoadOptions(Unknown Source)
at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
at org.python.pycode._pyx1.f$0(<string>:61)
at org.python.pycode._pyx1.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java)
at org.python.core.PyCode.call(PyCode.java)
at org.python.core.Py.runCode(Py.java)
at org.python.core.Py.exec(Py.java)
I'm very confused by it. Can anybody give me a solution or related docs?
Ethan.
Hi ethan,
You always need a driver present inside your <ODI installation directory>\drivers folder, for both the target and the source. Because ojdbc14.jar, the driver for Oracle, is already present inside the drivers folder, we don't have to do anything for it. But for Essbase you need a driver. If you haven't already done this, try adding it.
Hope it helps...
Regards
Susane -
Options for loading data from legacy mainframe to SAP CRM and ECC
Hello All,
I am new to SAP.
As part of data conversion planning from a legacy mainframe to SAP CRM and ECC systems, I need to know the different tools available for loading the data, and a comparative analysis of the tools showing their features. Please also describe the process to evaluate the tools (prototyping, technical discussions, etc.) and make recommendations.
I would also appreciate any information on testing tools (like eCATT) and any third-party tools (like Informatica).
I know I am asking for a lot of things here, but I really appreciate any information regarding this.
Thanks,
Balu
Hi
Data migration (conversions) mainly involves the following major steps:
Discovery -> Extract -> Cleanse -> Transform -> Load
Discovery -> involves identifying the various source systems where the actual data resides.
Extract -> extract the data from the source systems (legacy systems).
Cleanse -> cleanse the data by enriching it, de-duplication, etc.
Transform -> transform the data into an SAP-loadable format.
Load -> upload the data into SAP using the load programs (e.g. BDC, LSMW, Data Transfer Workbench, BAPIs, etc.).
Also i would request you to visit http://www.businessobjects.com/solutions/im/data_migration.asp
Cheers,
Hakim -
Taking too long for loaded data to become visible in Reporting
Hi experts,
I have a cost center cube which takes very, very long to show the "Loaded data is not yet visible in Reporting" icon for the request. I do not have aggregates on the cube, and no other requests are red/yellow. What is the reason for this weird behaviour? I ran the consistency check; there are no overlaps/errors for the request.
Any help is appreciated.
Thanks in advance
D Bret
Hi Dave,
Try running RSRV for the cube, then refresh the data in the cube.
RSRV --> Combined test --> Transaction data --> Status of data in the InfoCube.
If you find any errors after running this check, repair them and then refresh the cube's data.
Hope this helps.
Thanks & Regards,
Suchitra.V