Load data to infoobject master text from csv
Dear Experts
I have an InfoObject with master data,
and on the Master Data/Texts tab I have ticked:
- short text,
- medium text,
- text language-dependent.
Now I want to load data from a CSV file into the InfoObject's master text.
How many columns must I set up in my CSV?
Must I include a language column and a record number column?
really appreciate your help
Best Regards
Jeiming
hi,
the easy way is to create a direct update InfoSource and then the transfer rules on the text.
the system will automatically create the transfer rules with the fields, and then you can use them for the data load.
in general you can go to the text table and see the fields required.
you have to include all fields of the text table in the transfer rules.
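To make the file layout concrete: for a language-dependent text load you typically need one row per characteristic value and language, with a language column plus the ticked text columns (a record number column is not needed for texts). A minimal Python sketch of such a file; the column names and the ';' separator are assumptions here, so check the actual text table of your InfoObject in the system:

```python
import csv

# One row per characteristic value and language.
# Columns (assumed): language key, characteristic key, short text, medium text.
rows = [
    ("EN", "1000", "Plant 1", "Plant 1 Hamburg"),
    ("DE", "1000", "Werk 1",  "Werk 1 Hamburg"),
]

with open("master_text.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, delimiter=";", quotechar='"')
    writer.writerow(["LANGU", "KEY", "TXTSH", "TXTMD"])  # header row is optional
    writer.writerows(rows)
```

In the InfoPackage you would then declare ';' as the data separator and '"' as the escape sign, and skip the header row if you wrote one.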
regards
Similar Messages
-
Error encountered while loading data into an Essbase cube from oracle DB
Hi All,
I am trying to load data into an Essbase cube from an Oracle DB. Essbase is installed on a 64-bit HP-UX machine and the Oracle database on an HP-UX 11.23 B server. While performing the above I am getting the error messages below:
K/INFO - 1021013 - ODBC Layer Error: [08001] ==> [[DataDirect][ODBC Oracle Wire Protocol driver][Oracle]ORA-01034: ORACLE not available
ORA-27101: shared memory realm does not exist
HP-UX Error: 2: No such file or directory].
OK/INFO - 1021014 - ODBC Layer Error: Native Error code [1034] .
ERROR - 1021001 - Failed to Establish Connection With SQL Database Server. See log for more information.
Can you please help me out in identifying why this could have occurred: is it a network problem, or something related to the database?
Regards,
Priya
The error is related to Oracle; I would check that it is all up and running:
ORA-01034: ORACLE not available
Cheers
John
http://john-goodwin.blogspot.com/ -
Load Data with 7.0 DataSource from Flat File to Write-Optimized DSO
Hi all,
we have a problem loading data from flat file using the 7.0 datasource.
We have to load a flat file (monthly) into a WO DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads data in delta mode from the DataSource into the WO DSO.
When I load the second file into the DataSource, the DTP loads all data present in the DataSource, not only the new data as expected in delta mode.
Has anyone any tips to help me?
Thank you for help.
Regards
Emiliano
Hi,
I am facing a similar problem.
I am using a Write-Optimized DSO and I have got only 1 request in the PSA (I have deleted all previous requests from the PSA and the DSO).
When I do a delta load from the PSA to the DSO, I expect to see only that 1 request get loaded into the DSO.
But it is picking up the data from 3 other requests and doubling the records...
Can you please tell me how you managed to get out of that issue?
Cheers,
Nisha -
Problem loading data to Hyperion Planning app. from Oracle EBS using ERPi
I'm using ERPi to load data to Planning application from EBS.
I have configured Source and Target, created import formats, location, data load rules etc.
When I'm running the data load rule it shows that data is imported and exported successfully. I can see the data in the right format in the data load workbench. ODI shows the process completed successfully. But still there is no data loaded in Planning.
Even AIF_HS_BALANCES staging view in ERPi repository shows correct data.
Can some one tell me what could be the problem.
Or where I can see logs for the data load step for Planning.
ODI operator logs are not very clear to me, and the same goes for the log tables in the ODI work repository.
Hi Tony
Apologies for the delay (my email spam-filtered the notification that you had replied, helpful!!). When you say that you have done it but discourage accessing the tables directly, how did you achieve this? Were you just careful about which data intersections you selected from the tables (e.g. always base-level data, so it is stored rather than calculated)?
I have used the HFM relational tables a couple of times to make use of the metadata that they contain but not tried to get data before.
Regards
Stuart -
Column Mapping while loading data via tab delimited text file
While attempting to load data I get an error message at the "Define Column Mapping" step. It reads as follows:
1 error has occurred
There are NOT NULL columns in SYSTEM.BASICINFO. Select to upload the data without an error
The drop down box has the names of the columns, but I don't know how to map them to the existing field names, I guess.
By the way, where can I learn what goes in "format" at this same juncture?
Thanks!
You can use Insert Into Array and wire the column index input instead of the row index, as shown in the following pic:
Just be sure that the number of elements in Array2 is the same as the number of rows in Array1.
Message Edited by tbob on 03-07-2006 11:32 AM
- tbob
Inventor of the WORM Global
Attachments:
Add Column.png 2 KB -
Loading 361000 records at a time from csv file
Hi,
One of my colleagues loaded 361,000 records from one file. How is this possible, as Excel accepts only 65,536 rows in one file?
Also, in the InfoPackage the following settings are selected; what do these mean?
Data Separator ;
Escape Sign "
Separator for Thousands .
Character Used for Decimal Point ,
Please let me know.
Hi Maya,
it is quite possible: besides MS Excel we have editors like TextPad (and Windows Notepad) that support more than 65k rows. The file may have been generated by a program or edited outside Excel, or a newer version of Excel was used; MS Excel 2007 supports more than 1 million rows.
e.g we have csv file
customer;product;quantity;revenue
a;x;"1.250,25";200
b;y;"5.5";300
data separator ;
- the character/delimiter used to separate the fields
escape sign "
- e.g. in "1.250,25";200 the quotes protect the field content, so quantity = 1.250,25
separator for thousands = .
- 1.250,25 means one thousand two hundred fifty ...
character used for decimal point = ,
- the comma in 1.250,25 marks the decimal part
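Those settings can be tried out quickly outside BW. A small Python sketch that parses a file like the example with ';' as data separator and '"' as escape sign; the number-conversion helper is my own illustration of the thousands/decimal settings, not part of BW:

```python
import csv
import io

# Sample file using ';' as data separator and '"' as escape sign.
raw = 'customer;product;quantity;revenue\na;x;"1.250,25";200\nb;y;"5,5";300\n'

def to_number(s):
    # "1.250,25" -> 1250.25: drop the thousands dots, turn the decimal comma into a dot
    return float(s.replace(".", "").replace(",", "."))

reader = csv.DictReader(io.StringIO(raw), delimiter=";", quotechar='"')
rows = [dict(r, quantity=to_number(r["quantity"])) for r in reader]
print(rows[0]["quantity"])  # 1250.25
```

Note how the escape sign is what lets a quantity like "1.250,25" contain the data separator's neighbours (dots and commas) without breaking the field split.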
check
http://help.sap.com/saphelp_nw70/helpdata/en/80/1a6581e07211d2acb80000e829fbfe/frameset.htm
http://help.sap.com/saphelp_nw70/helpdata/en/c2/678e3bee3c9979e10000000a11402f/frameset.htm
hope this helps. -
We have had Oracle RAC with 2 groups (master and slave).
We have created a materialized view (on the slave) from the master, and I have created a materialized view log on the master.
When the slave group's job was broken for a long time (12 hrs), we refreshed the job and ran the job ID.
But I have found the data in the materialized view (slave) differs from the TABLE on the master group (m$log = 0 rows).
Please tell me how we can resolve this, or whether the problem is m$log.
Yes, I do have a unique identifier on each record in the base table and therefore also in the MV.
So now the Downstream application has to update some of its own database tables based on what is in the MV that I provide - just so I understand how would that work e.g.:
1. MV gets refreshed from my base table every minute
2. Some records are newly inserted and some simply updated
and this is the part I don't understand (I might be missing something obvious)....
3. The downstream application interrogates the MV every 1-2 minutes, say, and updates its own tables based on the data there. But let's say the MV contains 10,000 records, and in the last minute 100 of those records have been updated and another 100 inserted: how will it know which 200 records it needs to pull?
Thanks again -
Issue with deleting text from CSV file in Numbers
We are converting an Excel file into a CSV file by way of Numbers, in an effort to upload contacts into my Contacts database on my MacBook Air. The Contacts database does not like the CSV file, and I am pretty sure it is because there is text in the phone number column. So a cell may read
"Office: 323.555.1212"
Before I change 600+ cells by manually taking out the word "Cell" or "Office" from these cells, I wonder if there is a way to format the column such that the text is omitted. Thanks!
Hi gergenson,
Find and Replace in Numbers.
Edit > Find > Show Search (command f) will bring up a search box:
Click on Find and Replace.
In the Find box, type Office: followed by two spaces (there are two spaces in your example).
Leave the Replace box empty.
Click on Replace All.
Repeat this for Cell.
That editing process will replace the Find text with nothing, and will leave the rest of the cell contents unchanged.
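If the clean-up ever needs to be scripted instead (say, on the exported CSV itself), the same idea fits in a few lines of Python; the list of labels is an assumption based on your example:

```python
import re

# Labels observed in the phone column; extend the list as needed.
LABEL = re.compile(r"^\s*(office|cell|home|work)\s*:\s*", re.IGNORECASE)

def clean_phone(value):
    """Drop a leading 'Office: ' / 'Cell: ' style label, keep the number."""
    return LABEL.sub("", value)

print(clean_phone("Office: 323.555.1212"))  # 323.555.1212
```

Like Find and Replace, this only touches the leading label and leaves the rest of the cell contents unchanged.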
Regards,
Ian. -
How to Load Master Data Text from Multiple Source Systems
Situation: Loading vendor master (text) from two systems. Some of the vendor keys/numbers are the same, and so is the description. I understand that if I was loading attributes, I could create another attribute such as a source system ID and compound it to 0Vendor. However, I don't believe you can do the same when loading text.
Question: So, how can I load duplicate vendor keys and descriptions into BW using the *_TEXT datasource so they do not overwrite each other?
Thanks!
Hi,
Don't doubt, it'll work!
Sys ID as compound to 0Vendor.
If you doubt about predefined structure of the *_TEXT datasource and direct infosource, then use a flexible infosource. Maybe you'll have to derive the sys ID in a routine or a formula.
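The effect of compounding can be illustrated outside BW: with the vendor number alone as the key, the second system's text overwrites the first; with (source system, vendor) as the key, both survive. A hedged Python sketch of that idea (the sample values are invented):

```python
# Texts arriving from two source systems with the same vendor number.
loads = [
    ("SYS1", "100001", "ACME Ltd."),
    ("SYS2", "100001", "ACME GmbH"),
]

# Keyed by vendor only: the second load overwrites the first.
by_vendor = {vendor: text for _sys, vendor, text in loads}

# Keyed by (source system, vendor), i.e. the compounded key: both are kept.
by_compound = {(sys_id, vendor): text for sys_id, vendor, text in loads}

print(len(by_vendor), len(by_compound))  # 1 2
```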
Best regards,
Eugene -
BI 7.0 - Error while loading data from R3 to PSA
Hi,
I transported a Master Data ATTR/TEXT object from DEV to QA (BI 7.0 NW 2004s) and I am getting the following error while loading the data from R3 into the PSA using the InfoPackage.
Any clue to resolve this issue is appreciated.
Monitor - STATUS tab:
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Service API .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
Monitor - Details Tab:
Extraction (messages): Errors occurred
Error occurred in the data selection
Request IDoc : Application document not posted
Error Message:
Diagnosis
In the source system, there is no transfer structure available for
InfoSource 0ACTIONTYPE_TEXT .
System Response
The data transfer is terminated.
Procedure
In the Administrator Workbench, regenerate from this source system the
transfer structure for InfoSource 0COMPANY_TEXT .
Thanks,
Hi,
Please try the below:
The l_dta data target that you want to load is an InfoObject (master data or text
table).
You must declare this for the InfoProvider in Transaction RSD1.
You do this on the 'Master Data/Text' tab by assigning any InfoArea to the InfoObject.
The InfoProvider (that is, this InfoObject) is then displayed below this InfoArea in
the InfoProvider tree.
Regards,
Senthil -
Loading data from infopackage via application server
Hi Gurus,
I have a requirement where I need to load data present in an internal table to a CSV file on the application server (AL11) via OPEN DATASET, then read the file from the application server via an InfoPackage (routine) and load it to the PSA.
Now I have created a custom program to load data to the AL11 application server, and I have used the code below.
DATA: BEGIN OF XX,
        NODE_ID    TYPE N LENGTH 8,
        INFOOBJECT TYPE C LENGTH 30,
        NODENAME   TYPE C LENGTH 60,
        PARENT_ID  TYPE N LENGTH 8,
      END OF XX.
DATA: I_TAB LIKE STANDARD TABLE OF XX.
DATA: FILE_NAME TYPE RLGRAP-FILENAME.

FILE_NAME = './SIMMA2.CSV'.
OPEN DATASET FILE_NAME FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

XX-NODE_ID    = '5'.
XX-INFOOBJECT = 'ZEMP_H'.
XX-NODENAME   = '5'.
XX-PARENT_ID  = '1'.
APPEND XX TO I_TAB.

XX-NODE_ID    = '6'.
XX-INFOOBJECT = 'ZEMP_H'.
XX-NODENAME   = '6'.
XX-PARENT_ID  = '1'.
APPEND XX TO I_TAB.

* Each TRANSFER writes the flat structure as one fixed-length line;
* no separator characters are added.
LOOP AT I_TAB INTO XX.
  TRANSFER XX TO FILE_NAME.
ENDLOOP.

* The original snippet never closed the file; without this the last
* buffer may not be flushed to AL11.
CLOSE DATASET FILE_NAME.
now i can see the data in the application server AL11.
Then in my infopackage i have the following code,
form compute_flat_file_filename
using p_infopackage type rslogdpid
changing p_filename like rsldpsel-filename
p_subrc like sy-subrc.
Insert source code to current selection field
$$ begin of routine - insert your code only below this line -
P_FILENAME = './SIMMA2.CSV'.
DATA: BEGIN OF XX,
        NODE_ID    TYPE N LENGTH 8,
        INFOOBJECT TYPE C LENGTH 30,
        NODENAME   TYPE C LENGTH 60,
        PARENT_ID  TYPE N LENGTH 8,
      END OF XX.
DATA: I_TAB LIKE STANDARD TABLE OF XX.

OPEN DATASET P_FILENAME FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF SY-SUBRC = 0.
  DO.
    READ DATASET P_FILENAME INTO XX.
    IF SY-SUBRC <> 0.
      EXIT.
    ELSE.
      APPEND XX TO I_TAB.
    ENDIF.
  ENDDO.
ENDIF.
CLOSE DATASET P_FILENAME.
P_SUBRC = 0.
I have the following doubts:
While loading the data from the internal table to the application server, do I need to add any "data separator" character and "escape sign" character?
Also, at the InfoPackage level I will select the "file type" as "CSV file"; what characters do I need to give in the "data separator" and "escape sign" boxes? Please point me to a clear tutorial for this. Also, can we use a process chain to load data for an InfoPackage using a file from the application server? This is a 3.x DataSource where we are loading a hierarchy via flat file on the application server.
Edited by: Raghavendraprasad.N on Sep 6, 2011 4:24 PM
Hi,
Correct me if my understanding is wrong. I think you are trying to load data to the initial ODS, and from that ODS the data goes to 2 targets through the PSA (Cube and ODS)...
I think you are working on version 3.x right now. Make sure of the following process in your process chain:
Start process
Load to Initial ODS
Activation of the Initial ODS
Further update through the PSA (which will update both ODS and Cube).
Make sure that you have proper update rules and an init for both targets from the lower ODS, and then load the data.
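On the separator question in the post above: if the InfoPackage file type is "CSV file", the file written with OPEN DATASET has to use the same data separator and escape sign declared there, whereas the TRANSFER in the program writes one fixed-length record with no separator at all. A Python sketch of the equivalent ';'-separated output; the file name and field layout are taken from the program, the separator choice itself is an assumption to be matched in the InfoPackage:

```python
import csv

# Same fields as the ABAP structure XX: node id, infoobject, node name, parent id.
rows = [
    ("5", "ZEMP_H", "5", "1"),
    ("6", "ZEMP_H", "6", "1"),
]

with open("SIMMA2.CSV", "w", newline="", encoding="utf-8") as f:
    # delimiter must match the InfoPackage's "Data Separator",
    # quotechar its "Escape Sign".
    csv.writer(f, delimiter=";", quotechar='"').writerows(rows)
```

In ABAP the same effect would mean CONCATENATE-ing the fields with the separator before the TRANSFER, rather than transferring the flat structure directly.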
Thanks -
Error while Loading data through .csv file
Hi,
I am getting the date error below when loading data into OLAP tables through a .csv file.
The value stored in the .csv is 20071113121100.
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
... nl:ERROR(u:'transformation error')).
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
Any help is greatly appreciated.
Thanks,
Poojak
1) Wrong format; you won't get much support loading OLAP cubes in here, I think.
2) Has your CSV file been anywhere near Excel, by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
*** We are using EBS. The file was already generated by consultants through a SQL query. I am getting the error while loading through Informatica. The target table is set up as a date datatype and the source is String(19).
Expression in Informatica is setup as below.
IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
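The root cause can be seen in plain Python: once the timestamp has been rendered as 2.00711E+13, the low-order digits are gone and no TO_DATE format string can recover them, while the untouched 14-digit string parses fine. A small sketch (illustration only, not Informatica code):

```python
from datetime import datetime

original = "20071113121100"        # YYYYMMDDHH24MISS as generated
mangled = float("2.00711E+13")     # what the scientific-notation round trip left

# The day and time digits have been destroyed by the truncation:
print(f"{mangled:.0f}")            # 20071100000000

# The untouched string converts without trouble:
print(datetime.strptime(original, "%Y%m%d%H%M%S"))  # 2007-11-13 12:11:00
```

So the fix is upstream: keep the column formatted as text (or regenerate the file without opening it in Excel) so the full 14 digits reach TO_DATE.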
Thanks,
Poojak -
Loading data to a cube from a DSO and a Flat file
Hi All,
I have an Infocube with fields
product
plant
customer
quantity
country
I am trying to load data into it: a few fields from a DSO and the others from a flat file,
e.g. plant and country from the DSO,
and product, customer and quantity from the flat file.
I have created 2 transformations,
one from the DSO -> cube (which works);
the other transformation does not get activated: it gives an error message saying "no source fields are assigned to the rule".
Is it possible to make a load of this sort? If so, why is the transformation giving that error?
Please do help!
Thanks and Regards,
Radhika
Dear Friend,
Not sure what you have done; it should be like this:
DataSource -> DSO1 -> Cube1
Flat file -> DSO (optional) -> Cube2
Then you should build a MultiProvider on top of these cubes (Cube1 and Cube2) and then create the query.
Please check if this is something you have done.
Hope this helps.
Thanks
sukhi -
Load data from dimensions to fact table
Hi,
I have 10 dimension tables and one fact table. Now I want to load data into the fact table from the dimension tables. How can I do that?
Regards,
joy
To elaborate on "Mallee"'s answer, it's not good practice to load dimensional data into a fact table.
I guess what you mean is how to refer to dimensions when loading a fact table.
Simply refer to the dimensions using lookups on the natural keys that link facts to dimensions. Don't forget, though, to use each dimension's surrogate key when populating the fact table's foreign key columns pointing at the dimensions.
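That lookup step can be sketched in a few lines of Python; the table and column names here are invented for illustration:

```python
# Dimension rows: natural (business) key -> surrogate key, as loaded earlier.
customer_dim = {"C-1001": 1, "C-1002": 2}   # natural key -> customer_sk
product_dim = {"P-55": 10, "P-77": 11}      # natural key -> product_sk

# Source fact records still carry natural keys.
source_facts = [
    {"customer": "C-1001", "product": "P-77", "amount": 250.0},
]

# Replace natural keys with the dimensions' surrogate keys before inserting
# into the fact table's foreign key columns.
fact_rows = [
    {
        "customer_sk": customer_dim[f["customer"]],
        "product_sk": product_dim[f["product"]],
        "amount": f["amount"],
    }
    for f in source_facts
]
print(fact_rows)
```

In a real ETL tool the dictionaries would be lookup transformations against the dimension tables, but the key substitution is the same.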
Hope this helps.
Cheers, Patrick -
Can I use LabVIEW to load data directly into system memory from a VI? The serial card I'm using isn't supported by NI, nor does VISA recognize it. I'm using a Call Library function to read the data from the card, and now I want it to go directly to system memory.
The data is being received at 1Mbps.
Thanks
Two questions:
One, if it's a serial card, then presumably it gives you more serial ports, like COM3, COM4, etc. If so, VISA would see the COM ports, and not the card directly. The drivers for the card should make it so that you see the extra serial ports from the OS. If you don't see the extra COM ports from VISA, then it sounds like the drivers for the card are not installed properly. Do the extra COM ports show up in Device Manager?
Two, you said that you're using a Call Library function to get the data and you want to put it into system memory. Errr.... you just read the data and you have it in memory by definition. Are you saying you need a way to parse the data so it shows up on a graph or something?