Designing a relational database from a flat file
Hi all,
I have little background in the field of databases. For my work task I have to design a relational database from a flat file with 60 fields. Would you please guide me on how to do this, or point me to documents I can refer to in order to develop a good and efficient design? I would appreciate your help, and I am happy to work, read, and learn at any level to get it right. Have a good day.
user501335 wrote:
Thank you very much Gerwin and edstevens for your replies. I guess I should have elaborated a little more.
Gerwin, I have gone through the link you sent me, and it was good to confirm that I am already familiar with the material up to that level. Thank you for the help. Can you please suggest a little more so that I can go a bit further? We would be developing it for SQL Server 2008.
edstevens, so kind of you to send a good suggestion. I will keep it in mind while working on this. Would you like to suggest anything more?
No.
If people would just get Third Normal Form burned into their brains, a lot of other problems would go away.
wish you both a good day
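Since the thread never shows what Third Normal Form actually looks like, here is a minimal, self-contained sketch. It assumes three hypothetical fields out of the 60 (customer id, customer name, city) repeating on every order row; SQLite stands in for SQL Server 2008:

```python
import sqlite3

# A minimal sketch of Third Normal Form, using hypothetical fields from
# an imagined 60-field flat file: customer data repeats on every order
# row, so it is split into its own table keyed by customer_id.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# Flat-file shape: customer attributes duplicated per order (not 3NF).
flat_rows = [
    ("C1", "Acme Ltd", "London", "ORD-1", 100.0),
    ("C1", "Acme Ltd", "London", "ORD-2", 250.0),
    ("C2", "Beta Inc", "Paris",  "ORD-3",  75.0),
]

# Normalized design: non-key attributes depend only on their table's key.
cur.execute("CREATE TABLE customer (customer_id TEXT PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("""CREATE TABLE sales_order (
    order_id TEXT PRIMARY KEY,
    customer_id TEXT REFERENCES customer(customer_id),
    amount REAL)""")

cur.executemany("INSERT OR IGNORE INTO customer VALUES (?, ?, ?)",
                {(r[0], r[1], r[2]) for r in flat_rows})
cur.executemany("INSERT INTO sales_order VALUES (?, ?, ?)",
                [(r[3], r[0], r[4]) for r in flat_rows])

# The duplicated customer data is now stored exactly once.
print(cur.execute("SELECT COUNT(*) FROM customer").fetchone()[0])     # 2
print(cur.execute("SELECT COUNT(*) FROM sales_order").fetchone()[0])  # 3
```

The same decomposition applies to any group of the 60 fields that depends on something other than the whole key of its row.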
Similar Messages
-
How to upload BP master details in bulk from a flat file to the CRM database
Hi,
Could anybody please help me with the best method to upload business partner data from a flat file initially, and if possible share any sample code?
Basically I am an ABAP consultant. In ERP I would generally use BDC or LSMW.
This is mission-critical for me.
Hi Chitturi,
I have not come across the scenario of custom include fields in the partner-creation BAPI. If the fields do not appear in the import/tables structure, you will probably need to update the corresponding DB table for those fields separately; otherwise the BAPI will update them. You will get the partner GUID and partner number from the export parameters of the BAPI.
Regards,
Dipesh
Message was edited by: Joaquin Fornas -
Error while loading table from flat file (.csv)
I have a flat file which I am loading into a target table in Oracle Warehouse Builder. It uses SQL*Loader internally to load the data from the flat file, and I am facing an issue. Please find the following error (an extract from the generated error log):
SQL*Loader-500: Unable to open file (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv)
SQL*Loader-552: insufficient privilege to open file
SQL*Loader-509: System error: The data is invalid.
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
I believe this is related to a SQL*Loader error.
Actually, the flat file resides on my system (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv), and I am connecting to an Oracle server.
Please suggest:
Is it required that I place the flat file on the Oracle server system?
Regards,
Ashoka BL
Hi,
I am getting an error as well, similar to the one described above, except that I get:
SQL*Loader-500: Unable to open file (/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
The difference is that Ashoka was getting
SQL*Loader-552: insufficient privilege to open file
and I get
SQL*Loader-553: file not found
The initial thought is that the file does not exist in the directory specified, or that I have spelt the filename incorrectly, but this has been checked and double-checked. The unix directory also has read and write permission.
Also in the error message is
Control File: C:\u21\oracle\owb_staging\WHITEST\source_depot\INV_LOAD_LABEL_INVENTORY.ctl
Character Set WE8MSWIN1252 specified for all input.
Data File: /u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv
Bad File: C:\u21\oracle\owb_staging\WHITEST\source_depot\Durham_Inventory_Labels.bad
As can be seen from the above, it seems to be trying to create the .ctl and .bad files on my C drive instead of on the server in the same directory as the .csv file. The location is registered to the server directory /u21/oracle/owb_staging/WHITEST/source_depot.
I am at a loss, as this works fine in development and I have just promoted all the development work to a systest environment using OMBPlus.
The directory structure in development is the same as systest, except that the data file is /u21/oracle/owb_staging/WHITED/source_depot/Durham_Inventory_Labels.csv, and everything works fine: the .ctl and .bad files are created in the same directory and the data successfully loads into an Oracle table.
Have I missed a setting in OWB during the promotion to systest, or is there something wrong with the way the repository in the systest database is set up?
The systest and development databases are on the same box.
Any help would be much appreciated
Thanks
Edwin -
Loading data from flat file...
Hello,
I am experiencing a problem where I can't load data from a flat file into a table. I have reverse engineered the flat file into ODI, but I don't know how to load the reversed data into an RDBMS table. I also don't know how to create this RDBMS table from within ODI so that it is reflected in the DB, nor how to load the data from the flat file into that table without having to add the columns manually in order to map between the flat file and the table.
In conclusion, I need to know how to create an RDBMS table from within ODI on the database, and how to automatically map the flat file to the DB table and load the data into it.
Regards,
Hossam
Hi Hossam,
We can use an ODI procedure to create the table in the DB.
Make sure you keep the column names in the table the same as the column names in the flat file, so that the columns can be mapped automatically.
Regarding loading data from a flat file (i.e. when our source is a flat file): up to ODI 10.1.3.4 we need to insert the datastore manually, since the file system cannot be reversed.
Please let me know, Hossam, if I can assist you further.
Thanks and Regards,
Andy -
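Andy's advice to keep the flat-file column names identical to the table column names is what makes name-based auto-mapping possible. A rough sketch of that mapping logic outside ODI, with hypothetical column names and SQLite standing in for the target RDBMS:

```python
import csv
import io
import sqlite3

# Sketch of name-based auto-mapping: columns are paired by name, not by
# position, which is why matching headers lets a tool map them for you.
# All table and column names here are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (emp_id TEXT, emp_name TEXT, salary REAL)")

# The flat file's header order differs from the table's column order.
flat_file = io.StringIO("emp_name,salary,emp_id\nAlice,50000,E1\nBob,60000,E2\n")
reader = csv.DictReader(flat_file)

table_cols = [row[1] for row in con.execute("PRAGMA table_info(target)")]
mapped = [c for c in table_cols if c in reader.fieldnames]  # auto-mapped by name

placeholders = ",".join("?" for _ in mapped)
sql = f"INSERT INTO target ({','.join(mapped)}) VALUES ({placeholders})"
con.executemany(sql, ([r[c] for c in mapped] for r in reader))

print(con.execute("SELECT emp_id, emp_name FROM target ORDER BY emp_id").fetchall())
```

If a header does not match any table column it simply drops out of `mapped`, which mirrors the manual re-mapping work the advice is trying to avoid.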
Loading transaction data from flat file to SNP order series objects
Hi,
I am a BW developer and I need to provide data to my SNP team.
Can you please tell me more about <b>loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures</b>? There is a 3rd-party tool called WebConnect that gets data from external systems and can deliver it as flat files or database tables in whatever format we want.
I know we can use BAPIs, but I don't know how. Can you please send any <b>sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects</b>?
Please let me know ASAP how to get data from a flat file into SNP order-based objects, with options, and I will be very grateful.
thanks in advance
Rahul
Hi,
Please go through the following links:
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
Hope this helps...
Regards,
Habeeb
Assign points if helpful..:) -
VA31 schedule line agreement data upload from flat file
Hi abapers
I have to upload some data (VA31) from a flat file to my database (schedule line agreement data). I am using a user exit for it, but I can't figure out which user exit will serve the purpose or where to check. I tried SDTRM001, MEETA001 and the V45A series, but it's not working. I set breakpoints in these user exits, but execution does not stop at them.
Can any one help me where to find which user exit will work in this case?
Thanks in Advance
Annu
Hi Prash,
Check these posts:
Re: Increasing the length of Infoobject from 60 to 240 characters
Re: InfoObject > 60
Bye
Dinesh -
Wrong century for the dates from flat file
I am in the process of uploading data from a flat file (a CSV) into Oracle via an external table.
All the dates end up with a year of 20xx. My understanding was that 51-99 would be prefixed with 19 (giving 1951-1999) and 00-50 would be prefixed with 20 (giving 2004, etc.).
The column below (startdate) is of type timestamp.
Is my understanding wrong?
SQL> select startdate , to_date(startdate , 'dd/mm/yyyy' ) newdate from tab;
NEW_DATE STARTDATE
09/18/2090 18-SEP-90 12.00.00.000000000 AM
BANNER
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
PL/SQL Release 11.1.0.7.0 - Production
CORE 11.1.0.7.0 Production
TNS for 64-bit Windows: Version 11.1.0.7.0 - Production
NLSRTL Version 11.1.0.7.0 - Production
SQL> show parameter nls
NAME TYPE VALUE
nls_calendar string
nls_comp string BINARY
nls_currency string
nls_date_format string
nls_date_language string
nls_dual_currency string
nls_iso_currency string
nls_language string AMERICAN
nls_length_semantics string BYTE
nls_nchar_conv_excp string FALSE
nls_numeric_characters string
nls_sort string
nls_territory string AMERICA
nls_time_format string
nls_time_tz_format string
nls_timestamp_format string
nls_timestamp_tz_format string
SQL>
SQL>
Offense? None taken...
I literally typed the SQL (I did not copy and paste the SQL / result sets). I did use to_char in my original SQL. I don't want to expose the real table; that's why I gave a made-up test case. If you look carefully at the SQL and the corresponding result sets, you would have noticed.
Here is the SQL used (of course, I have changed the table name):
select to_char(startdate , 'mm/dd/yyyy') sdate, startdate from t
SDATE STARTDATE
09/18/2090 18-SEP-90 12.00.00.000000 AM
The flat file had the value 091890 as the data. -
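The windowing the poster describes (00-50 becoming 20xx, 51-99 becoming 19xx) is roughly Oracle's RR date-format behavior, whereas the YY format always takes the current century, which is how 091890 became 2090. A sketch of the RR rule as commonly documented (an approximation for illustration, not Oracle's exact implementation):

```python
from datetime import date

def rr_century(two_digit_year, current_year=None):
    """Approximate Oracle's RR date-format windowing: two-digit years
    00-49 land near the current century, 50-99 in the adjacent one."""
    current_year = current_year or date.today().year
    cur_cc, cur_yy = divmod(current_year, 100)
    if two_digit_year < 50:
        # Same century if we are in its first half, next century otherwise.
        cc = cur_cc if cur_yy < 50 else cur_cc + 1
    else:
        # Previous century if we are in the first half, same one otherwise.
        cc = cur_cc if cur_yy >= 50 else cur_cc - 1
    return cc * 100 + two_digit_year

# With a current year of 2010: RR gives 1990 for '90', while YY would
# give 2090, the wrong-century result seen in the post.
print(rr_century(90, current_year=2010))  # 1990
print(rr_century(18, current_year=2010))  # 2018
```

The practical fix on the Oracle side is to load the two-digit year with an RR or RRRR mask instead of YY/YYYY.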
Error in loading transaction data from flat file
hi everybody
While I try to load data from a flat file, at scheduling time it shows the error "check load from infosource", and while monitoring the data it shows "the corresponding data packets were not updated using PSA". What can I do to rectify the problem?
Thanks in advance.
Hi,
Please check whether the data is in the correct format, using the "Preview" option in the InfoPackage.
Check whether all the related AWB objects are active.
Regards,
K.Manikandan. -
Need a script to import the data from flat file
Hi Friends,
Does anyone have any scripts to import data from flat files into an Oracle database (Linux OS)? I have to automate the script to run every 30 minutes, check for any flat files in the incoming directory, and process them without user interaction.
Thanks.
Srini
Here is my init.ora file:
# $Header: init.ora 06-aug-98.10:24:40 atsukerm Exp $
# Copyright (c) 1991, 1997, 1998 by Oracle Corporation
# NAME
# init.ora
# FUNCTION
# NOTES
# MODIFIED
# atsukerm 08/06/98 - fix for 8.1.
# hpiao 06/05/97 - fix for 803
# glavash 05/12/97 - add oracle_trace_enable comment
# hpiao 04/22/97 - remove ifile=, events=, etc.
# alingelb 09/19/94 - remove vms-specific stuff
# dpawson 07/07/93 - add more comments regarded archive start
# maporter 10/29/92 - Add vms_sga_use_gblpagfile=TRUE
# jloaiza 03/07/92 - change ALPHA to BETA
# danderso 02/26/92 - change db_block_cache_protect to dbblock_cache_p
# ghallmar 02/03/92 - db_directory -> db_domain
# maporter 01/12/92 - merge changes from branch 1.8.308.1
# maporter 12/21/91 - bug 76493: Add control_files parameter
# wbridge 12/03/91 - use of %c in archive format is discouraged
# ghallmar 12/02/91 - add global_names=true, db_directory=us.acme.com
# thayes 11/27/91 - Change default for cache_clone
# jloaiza 08/13/91 - merge changes from branch 1.7.100.1
# jloaiza 07/31/91 - add debug stuff
# rlim 04/29/91 - removal of char_is_varchar2
# Bridge 03/12/91 - log_allocation no longer exists
# Wijaya 02/05/91 - remove obsolete parameters
# Example INIT.ORA file
# This file is provided by Oracle Corporation to help you customize
# your RDBMS installation for your site. Important system parameters
# are discussed, and example settings given.
# Some parameter settings are generic to any size installation.
# For parameters that require different values in different size
# installations, three scenarios have been provided: SMALL, MEDIUM
# and LARGE. Any parameter that needs to be tuned according to
# installation size will have three settings, each one commented
# according to installation size.
# Use the following table to approximate the SGA size needed for the
# three scenarious provided in this file:
# -------Installation/Database Size------
# SMALL MEDIUM LARGE
# Block 2K 4500K 6800K 17000K
# Size 4K 5500K 8800K 21000K
# To set up a database that multiple instances will be using, place
# all instance-specific parameters in one file, and then have all
# of these files point to a master file using the IFILE command.
# This way, when you change a public
# parameter, it will automatically change on all instances. This is
# necessary, since all instances must run with the same value for many
# parameters. For example, if you choose to use private rollback segments,
# these must be specified in different files, but since all gc_*
# parameters must be the same on all instances, they should be in one file.
# INSTRUCTIONS: Edit this file and the other INIT files it calls for
# your site, either by using the values provided here or by providing
# your own. Then place an IFILE= line into each instance-specific
# INIT file that points at this file.
# NOTE: Parameter values suggested in this file are based on conservative
# estimates for computer memory availability. You should adjust values upward
# for modern machines.
# You may also consider using Database Configuration Assistant tool (DBCA)
# to create INIT file and to size your initial set of tablespaces based
# on the user input.
# replace DEFAULT with your database name
db_name=DEFAULT
db_files = 80 # SMALL
# db_files = 400 # MEDIUM
# db_files = 1500 # LARGE
db_file_multiblock_read_count = 8 # SMALL
# db_file_multiblock_read_count = 16 # MEDIUM
# db_file_multiblock_read_count = 32 # LARGE
db_block_buffers = 100 # SMALL
# db_block_buffers = 550 # MEDIUM
# db_block_buffers = 3200 # LARGE
shared_pool_size = 3500000 # SMALL
# shared_pool_size = 5000000 # MEDIUM
# shared_pool_size = 9000000 # LARGE
log_checkpoint_interval = 10000
processes = 50 # SMALL
# processes = 100 # MEDIUM
# processes = 200 # LARGE
parallel_max_servers = 5 # SMALL
# parallel_max_servers = 4 x (number of CPUs) # MEDIUM
# parallel_max_servers = 4 x (number of CPUs) # LARGE
log_buffer = 32768 # SMALL
# log_buffer = 32768 # MEDIUM
# log_buffer = 163840 # LARGE
# audit_trail = true # if you want auditing
# timed_statistics = true # if you want timed statistics
max_dump_file_size = 10240 # limit trace file size to 5 Meg each
# Uncommenting the line below will cause automatic archiving if archiving has
# been enabled using ALTER DATABASE ARCHIVELOG.
# log_archive_start = true
# log_archive_dest = disk$rdbms:[oracle.archive]
# log_archive_format = "T%TS%S.ARC"
# If using private rollback segments, place lines of the following
# form in each of your instance-specific init.ora files:
# rollback_segments = (name1, name2)
# If using public rollback segments, define how many
# rollback segments each instance will pick up, using the formula
# # of rollback segments = transactions / transactions_per_rollback_segment
# In this example each instance will grab 40/5 = 8:
# transactions = 40
# transactions_per_rollback_segment = 5
# Global Naming -- enforce that a dblink has same name as the db it connects to
global_names = TRUE
# Edit and uncomment the following line to provide the suffix that will be
# appended to the db_name parameter (separated with a dot) and stored as the
# global database name when a database is created. If your site uses
# Internet Domain names for e-mail, then the part of your e-mail address after
# the '@' is a good candidate for this parameter value.
# db_domain = us.acme.com # global database name is db_name.db_domain
# FOR DEVELOPMENT ONLY, ALWAYS TRY TO USE SYSTEM BACKING STORE
# vms_sga_use_gblpagfil = TRUE
# FOR BETA RELEASE ONLY. Enable debugging modes. Note that these can
# adversely affect performance. On some non-VMS ports the db_block_cache_*
# debugging modes have a severe effect on performance.
#_db_block_cache_protect = true # memory protect buffers
#event = "10210 trace name context forever, level 2" # data block checking
#event = "10211 trace name context forever, level 2" # index block checking
#event = "10235 trace name context forever, level 1" # memory heap checking
#event = "10049 trace name context forever, level 2" # memory protect cursors
# define parallel server (multi-instance) parameters
#ifile = ora_system:initps.ora
# define two control files by default
control_files = (ora_control1, ora_control2)
# Uncomment the following line if you wish to enable the Oracle Trace product
# to trace server activity. This enables scheduling of server collections
# from the Oracle Enterprise Manager Console.
# Also, if the oracle_trace_collection_name parameter is non-null,
# every session will write to the named collection, as well as enabling you
# to schedule future collections from the console.
# oracle_trace_enable = TRUE
# Uncomment the following line, if you want to use some of the new 8.1
# features. Please remember that using them may require some downgrade
# actions if you later decide to move back to 8.0.
#compatible = 8.1.0
Thanks.
Srini -
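The init.ora above does not actually answer Srini's question. A minimal sketch of the polling loader he describes, with SQLite standing in for Oracle (in a real setup you would typically shell out to SQL*Loader or use an external table) and cron providing the 30-minute schedule; all directory, file, and table names are hypothetical:

```python
import csv
import os
import shutil
import sqlite3
import tempfile

def process_incoming(incoming_dir, processed_dir, db_path):
    """Load every CSV found in incoming_dir into a staging table, then
    move the file aside so it is not loaded twice. Self-contained sketch:
    SQLite stands in for the Oracle target."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS staging (c1 TEXT, c2 TEXT)")
    loaded = 0
    for name in sorted(os.listdir(incoming_dir)):
        if not name.lower().endswith(".csv"):
            continue
        path = os.path.join(incoming_dir, name)
        with open(path, newline="") as fh:
            con.executemany("INSERT INTO staging VALUES (?, ?)", csv.reader(fh))
        shutil.move(path, os.path.join(processed_dir, name))  # avoid re-loading
        loaded += 1
    con.commit()
    con.close()
    return loaded

# Demo run against temporary directories. cron would provide the schedule,
# e.g.:  */30 * * * * /usr/bin/python3 /opt/etl/process_incoming.py
if __name__ == "__main__":
    inc, done = tempfile.mkdtemp(), tempfile.mkdtemp()
    with open(os.path.join(inc, "a.csv"), "w") as fh:
        fh.write("1,alpha\n2,beta\n")
    print(process_incoming(inc, done, ":memory:"))  # 1 file loaded
```

Moving each processed file out of the incoming directory is what makes the every-30-minutes run idempotent: a second pass finds nothing to do.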
SSIS - import data from flat file to table (SQL Server 2012)
I have created an SSIS package for importing data from a flat file to a table in SQL Server 2012,
but I get the below error for some columns in the data flow task:
error: the column cannot be processed because more than one code page (950 and 1252) are specified for it
Can anyone help?
Hi,
The issue occurs because the source flat file uses ANSI/OEM Traditional Chinese Big5 encoding. When processing the source file, the flat file connection manager uses code page 950 for the columns. Because SQL Server uses code pages to perform conversions between non-Unicode data and Unicode data, the data in the code page 950 based input columns cannot be loaded into code page 1252 based destination columns. To resolve the issue, you need to load the data into a SQL Server destination table that includes Unicode columns (nchar or nvarchar), and convert the input columns to Unicode columns via a Data Conversion transformation or the Advanced Editor for the Flat File Source at the same time.
Another option that may not be that practical is to create a new database based on the Chinese_Taiwan_Stroke_BIN collation, and load the data to non-Unicode columns directly.
Reference:
http://social.technet.microsoft.com/Forums/windows/en-US/f939e3ba-a47e-43b9-88c3-c94bdfb7da58/forum-faq-how-to-fix-the-error-the-column-xx-cannot-be-processed-because-more-than-one-code-page?forum=sqlintegrationservices
Regards,
Mike Yin
TechNet Community Support -
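Mike Yin's point that conversions between code pages always pass through Unicode can be seen directly in Python, which ships codecs for both code pages involved (big5 for 950, cp1252 for 1252):

```python
# Sketch of the code-page clash described above. Decoding the Big5
# (code page 950) bytes succeeds, and so does re-encoding to a Unicode
# column type, but encoding the same characters to code page 1252 fails,
# which is why the reply recommends nchar/nvarchar destination columns.
big5_bytes = "台北".encode("big5")   # "Taipei" as read from the flat file
text = big5_bytes.decode("big5")     # bytes -> Unicode, always the middle step

utf16 = text.encode("utf-16-le")     # fine: Unicode (nchar/nvarchar) destination
try:
    text.encode("cp1252")            # non-Unicode code page 1252 destination
except UnicodeEncodeError as exc:
    print("cannot be processed:", exc.reason)
```

The Chinese_Taiwan_Stroke_BIN option in the reply works for the same reason: it keeps the destination in a code page that can represent the characters, avoiding the 1252 step entirely.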
How to upload schedule lines from flat files to SAP
Dear all,
I want to upload schedule lines from flat files into SAP schedule lines. The flat files have 15 schedule lines each, with the data organized by date, and they contain more fields than the SAP screen. We have more than 6 items, and with 15 schedule lines that is about 90 data records to upload for one customer every 15 days. How can I do this? Is there any direct way on the functional side, without the help of any ABAP? My user will do it himself, so he needs a permanent solution.
with regards
Subrat
Hi Subrat,
You can upload the data (either master or transaction data) with the help of LSMW. All you need to do is work through LSMW; within it you can use batch input recording, a BAPI, or an IDoc. I am sending the LSMW notes here; go through them and do the work.
Once you create the LSMW project, you can ask for the data from the user, or explain the program to the user so he can run the flat file to upload the data.
If you require the LSMW material, just send me a blank mail. My mail id is [email protected]
Reward if Helpful.
Regards,
Praveen Kumar.D -
Short dump while reading a currency field from flat file into internal table
Hi,
I am getting a short dump (a number conversion dump) while reading a currency value into a field of an internal table from a fixed-length flat file.
Do I need to use a string variable to get the value from the flat file, or how should I handle it?
Please suggest.
Santosh,
Thanks for your inputs,
but my internal table field is of type DEC(5,2), and I am getting an error that it needs to be of type 'C'. Can you suggest?
Ex:
MOVE wa_temp-infile_string+106(8) TO wa_item-QT_PERCENT.
This didn't work,
so I tried moving into a separate variable:
MOVE wa_temp-infile_string+106(8) TO v_percent.
and then:
WRITE v_percent TO wa_item-QT_PERCENT. -
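The pattern being groped for here (take the fixed-width slice as characters first, clean it, then convert) looks like this outside ABAP; a sketch with hypothetical offsets, where Python's Decimal plays the role of the DEC(5,2) field:

```python
from decimal import Decimal, InvalidOperation

def parse_amount(record, offset, length):
    """Take the fixed-width slice as a character string first, clean it,
    then convert to a decimal type. Offsets are hypothetical."""
    raw = record[offset:offset + length].strip()
    raw = raw.replace(",", "")          # tolerate thousands separators
    try:
        return Decimal(raw).quantize(Decimal("0.01"))
    except InvalidOperation:
        # In ABAP a direct MOVE of this value would short-dump; here we
        # report the bad field instead of aborting.
        return None

line = "HEADERDATA   12.50 TRAILER"
print(parse_amount(line, 13, 5))   # 12.50
print(parse_amount(line, 19, 7))   # None (non-numeric slice)
```

This is the same idea as the WRITE-into-a-character-variable workaround above: never convert raw file bytes straight into a packed numeric field.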
How to upload data from flat file to datastore object in BI 7.0
Dear friends,
Please tell me
a step-by-step process to upload data from a flat file to a DataStore object in BI 7.0
<removed by moderator>
please help me
Thanks,
D.prabhu
Edited by: Siegfried Szameitat on Aug 17, 2011 11:40 AM
Create a transformation on the data source, keep the DSO as the target, and load.
Ravi Thothadri -
Data is not updated to InfoCube (BI 7.0) when I load from flat file
Hi Experts,
When I try to load data from a flat file to an InfoCube (BI 7.0) with full or delta update, I am getting the following errors:
1) Error while updating to data target, message no. RSBK241
2) Processed with errors, message no. RSBK257
Everything up to the transformations is successful, but updating the data to the cube is not.
What do the message numbers indicate? Can I find a solution anywhere based on the message numbers?
Anybody please help me..
Regards
Anil
Hi Bhaskar,
There is no problem with cube activation, but when I try to activate the cube using the function module RSDG_CUBE_ACTIVATE, it goes to a short dump. Anyhow, I want to know in what scenario we use this function module to activate the cube.
There are no special characters in the file. I deleted the requests in the PSA and the cube and re-ran the InfoPackage & DTP, but I am getting the same error.
The following errors are found in the updating menu of the details tab (display messages option):
1) Error while updating to data target, message No: RSBK241
2) Processed with Errors, Message No: RSBK 257.
Thank you
Regards
Anil
Edited by: anilkumar.k on Aug 20, 2010 8:35 PM -
Data Source creation for Master Data from Flat File to BW
Hi,
I need to upload master data from a flat file. Can anybody describe, step by step, everything from creating the DataSource up to loading into the master data InfoObject?
Does anybody have a document?
Regards,
Chakri.
Hi,
This is the procedure.
1. Create the master data InfoObject, with or without attributes.
2. Create an InfoSource:
a) with flexible update,
or
b) with direct update.
3. Create transfer rules, assign the names of the master data and attributes in the "Transfer rules" tab, and transfer them to the communication structure.
4. Create the flat file with the same structure as the communication structure.
5. If you chose direct update, create an InfoPackage, assign the name of the flat file, and schedule it.
6. If you chose flexible update, create an update rule with the name of the InfoSource, and then schedule it by creating an InfoPackage.
Hope this helps. If you still have problems, let me know.
Follow this link also.
http://help.sap.com/saphelp_nw2004s/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
Assign points if helpful.
Vinod.