A problem about flat files, please help!
Hi all,
I have run into a problem while working with OWB.
In my case I have two flat files as data sources.
At first I used one of the files as the data source, and everything went through smoothly, including deploy and execute.
But when I used the other file as the data source, an error came up while the connectors were being deployed: it reported "Could not find location XXXX".
I wonder if there is any limitation on using flat files as a data source. If not, how can I deal with such a problem?
Best regards,
Thanks a lot!
When I deploy with flat file sources I use sqlldr, which is what OWB itself uses, and sqlldr has no problem with multiple sources. After generating the mapping, check the generated script: make sure every source path is mentioned properly in the INFILE option of the sqlldr script.
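For reference, a generated SQL*Loader control file with two sources would contain one INFILE clause per file, roughly like this sketch (table, column, and file names are illustrative; compare against the script OWB actually generated):

```
LOAD DATA
INFILE 'path/to/source_file_1.dat'
INFILE 'path/to/source_file_2.dat'
APPEND
INTO TABLE stage_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
```

If one of the paths here is missing or points at a location the runtime cannot resolve, you get exactly the "Could not find location" style of failure at deploy time.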
Regards,
Roopa
Similar Messages
-
Problem performing flat file reconciliation in 11gR2
Hello,
In R1, I had configured a simple GTC flat file connector that worked well. I've now updated to R2 (actually, I've done a clean install), recreated the same connector configuration, and tested with a text file that worked in R1, but this time it doesn't work. I've added the following loggers to my OIM server logging config to get a better idea of what's going on:
<logger name="oracle.iam.identity.scheduledtasks"/>
<logger name="oracle.iam.scheduler"/>
<logger name="oracle.iam.platform.scheduler"/>
<logger name="Xellerate.Scheduler"/>
<logger name="Xellerate.Scheduler.Task"/>
<logger name="oracle.iam.reconciliation"/>
However, the problem seems to happen before any reconciliation task is executed because I don't see any information about the data being read from the file. Here are the errors I'm getting :
oracle.iam.reconciliation.exception.ReconciliationException: Invalid Profile - GTCCSVTEST1_GTC
at oracle.iam.reconciliation.impl.ReconOperationsServiceImpl.getProfile(ReconOperationsServiceImpl.java:2121)
at oracle.iam.reconciliation.impl.ReconOperationsServiceImpl.ignoreEvent(ReconOperationsServiceImpl.java:544)
at oracle.iam.reconciliation.impl.ReconOperationsServiceImpl.ignoreEvent(ReconOperationsServiceImpl.java:535)
Caused by: oracle.iam.reconciliation.exception.ConfigNotFoundException: Invalid Profile - GTCCSVTEST1_GTC
at oracle.iam.reconciliation.impl.config.CoreProfileManagerImpl$ProfileMarshaller.unMarshal(CoreProfileManagerImpl.java:521)
at oracle.iam.reconciliation.impl.config.CoreProfileManagerImpl$ProfileMarshaller.unMarshal(CoreProfileManagerImpl.java:504)
[org.xml.sax.SAXParseException: cvc-minLength-valid: Value '' with length = '0' is not facet-valid with respect to minLength '1' for type 'matchingRuleType'.]
at javax.xml.bind.helpers.AbstractUnmarshallerImpl.createUnmarshalException(AbstractUnmarshallerImpl.java:315)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.createUnmarshalException(UnmarshallerImpl.java:522)
Caused by: org.xml.sax.SAXParseException: cvc-minLength-valid: Value '' with length = '0' is not facet-valid with respect to minLength '1' for type 'matchingRuleType'.
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
at org.apache.xerces.util.ErrorHandlerWrapper.error(Unknown Source)
It looks like something is wrong with my reconciliation profile, so I've gone to the Design Console and created a default reconciliation profile for my GTCCSVTEST1_GTC Resource Object, but it did not help.
Any ideas?
Thanks,
--jtellier

That's it. I've just learned that a Reconciliation Rule is created automatically when a field is set as "Matching Only" when configuring the connector. I had such a field in the initial configuration, but skipped this step when configuring the connector in R2.
Thanks,
--jtellier -
Encountering problem in Flat File Extraction
Flat file extraction: the settings I maintain in the data source "field tab" are conversion routine "ALPHA" and format "External". But when I load the data, a record maintained in lower case (in the flat file) is not getting converted into upper case. Without the ALPHA conversion, however, the lower-case data is converted into the SAP internal format (upper case). Could you please help me fix this problem when I use both ALPHA and External together?

Hi, did you enable the "lowercase letters" checkbox at the InfoObject level for the objects where you want lower-case values? Check the help.sap.com site as well.
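For context, the ALPHA routine only normalizes alignment and padding; it does not touch letter case. A rough Python sketch of the input conversion (the real routine is part of SAP; this is just an approximation of its behavior):

```python
def alpha_input(value: str, length: int = 18) -> str:
    # Purely numeric values are right-justified and zero-padded;
    # anything else is left-justified. Note that ALPHA does not
    # change letter case: uppercasing happens separately, unless the
    # "lowercase letters" flag is set on the InfoObject.
    v = value.strip()
    if v.isdigit():
        return v.rjust(length, "0")
    return v.ljust(length)

print(alpha_input("50", 10))     # "0000000050"
print(alpha_input("mat01", 10))  # "mat01     "
```

So if lower-case text survives with ALPHA active, the case handling has to be fixed via the InfoObject setting, not the conversion routine.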
Goodday -
Problem with flat file loading/Special characters
Hi, All,
We just migrated from version 3.0B to BI 7.0, and I have a problem I can't handle at the moment. We are on Unicode; the current code page of the server is 4103 and the code page of the frontend is 4110 (from transaction SNL1).
I'm loading Czech texts for my materials via CSV files. The CSV file is correct on my PC; I can see the special characters.
The problem is during loading: whatever character set I choose in the InfoPackage (Standard or the user-dependent character set setting), the special characters do not come through correctly.
Has anyone already encountered this kind of problem, and if so, how did you solve it?
Thanks

Hi,
Check the letter case in your flat file: if the values are in lower case, change them to upper case, or else change the InfoObject settings by selecting the lower-case letters checkbox on the InfoObject.
If helpful, please assign points.
regards
harikrishna.N -
Problem Reading Flat File from Application server
Hi All,
I want to upload a flat file with a line length of 3000.
While uploading it to the application server, only up to length 555 is getting uploaded.
I am using this code:
DATA: BEGIN OF TB_DATA OCCURS 0,
        LINE(3000),
      END OF TB_DATA.

*-- Upload the flat file from the local drive using FM "UPLOAD",
*-- then write it to the application server:
OPEN DATASET TB_FILENAM-DATA FOR OUTPUT IN TEXT MODE. " ENCODING DEFAULT
IF SY-SUBRC NE 0.
  WRITE: / 'Unable to open file:', TB_FILENAM-DATA.
ELSE.
  LOOP AT TB_DATA.
    TRANSFER TB_DATA TO TB_FILENAM-DATA.
  ENDLOOP.
ENDIF.
CLOSE DATASET TB_FILENAM-DATA.
What could be the problem?
Could it be due to any Configuration Problem?
Waiting for your replies.
Thanks and Regards.
Suki

Your code looks OK, but you may have trouble displaying the full width. Try:
WRITE: /001 TB_FILENAM-DATA+555.
(And don't forget to append to your ITAB after each read.)
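Rob's point can be checked with a quick round trip outside ABAP: a 3000-character line survives a plain text write and read intact, so apparent truncation at 555 characters usually indicates a display or declaration limit rather than a file problem. A small sketch in Python (the file name is illustrative):

```python
import os
import tempfile

# Write one 3000-character line to a text file and read it back.
line = "X" * 3000
path = os.path.join(tempfile.mkdtemp(), "flatfile.txt")
with open(path, "w") as f:
    f.write(line + "\n")

with open(path) as f:
    read_back = f.readline().rstrip("\n")

print(len(read_back))  # 3000: the full width survived the round trip
```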
Rob -
Problem with .xml files from PPro CS5 to FCP with RED and P2 files.
I created a project in Premiere Pro CS5.
I imported RED (.R3D) or P2 (.MXF) files and edited my media.
From the File menu I selected Export to "Final Cut Pro XML...".
I opened FCP and imported the XML file, but the media do not reconnect:
"Warning: Non-critical errors were found while processing an XML document.
Would you like to see a log of these errors now?"
I chose to see the log file, and it says:
"<file>: Unable to attach specified media file to new clip."
Where is the problem?

Hi Dennis, I reformatted my Mac Pro 8-core and installed the Final Cut Studio suite and CS5 Premium (no CS4).
I installed the Blackmagic DeckLink driver 7.6.3.
If I open After Effects and set the preview card to Blackmagic, I see the preview on the external monitor.
If I open Premiere Pro and set the preview, I don't see the Blackmagic card, only the second monitor, DV, etc.
In Premiere I see the Blackmagic presets, but not the preview card.
I have a second question.
With RED files I want to edit in Premiere Pro and color-correct in Apple Color via FCP.
My problem is that Color crashes (randomly) when I send a file from FCP to Color.
The sequence is:
I import the RED file in PPro CS5, edit it, and export an XML file.
Close PPro CS5.
Open FCP and import the XML (the media do not re-link).
Save the project in FCP.
Select the sequence and send it to Color.
At this moment Apple Color crashes.
I shut down the Mac.
I power up the Mac.
Open FCP, select the project, and send the sequence to Color.
Color sees the project but no media.
I re-link the media and edit my media in Color.
Why does Apple Color crash?
Sorry for my English
Many Thanks
Best regards,
Gianluca Cordioli
Alchemy Studio'S di Gianluca Cordioli
Via Pacinotti 24/B
37135 VERONA
cell.:+39 3385880683
[email protected]
www.alchemystudios.it -
Problem in Flat file transaction data loading to BI7
Hi Experts,
I am new to BI 7.0. I have a problem activating the data source for transaction data and creating a transfer routine to calculate sales revenue, which is not in the flat file; the flat file only has price per unit and quantity sold.
Flat file fields are:
Cust id SrepID Mat ID Price per unit Unit measure Quantity TR. Date
cu100 dr01 mat01 50 CS 5 19991001
created info objects are
IO_CUID
IO_SRID
IO_MATID
IO_PRC
IO_QTY
IO_REV
0CALDAY
When I created IO_QTY, unit/currency was given as 0UNIT. In the creation of the DataSource, under the Fields tab, an extra field CS was shown and I was confused about which object to map it to. At the same time, when I enter IO_QTY a dialog box prompts with 0UNIT, so I accepted it. Now I can't map CS to a unit because the system automatically created 0UNIT when I entered IO_QTY. Could you please give me a solution for this, and the procedure to enter a formula for calculating sales revenue at this level instead of at query level?
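For the revenue itself, the transfer-routine logic is just price per unit times quantity for each record; a sketch in Python using the sample row from the flat file above (the InfoObject names are the ones listed in the question):

```python
def sales_revenue(price_per_unit: float, quantity: float) -> float:
    # IO_REV = IO_PRC * IO_QTY, computed per record in the routine.
    return price_per_unit * quantity

# Sample row: cu100 dr01 mat01 50 CS 5 19991001
print(sales_revenue(50.0, 5))  # 250.0
```

Doing this in the transfer routine stores IO_REV per record, so no query-level formula is needed.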
Awaiting your solutions.
Thanks
Shrinu

Hi Sunil,
Thanks for your quick response. I tried to assign the InfoObjects under the Fields tab during DS creation. I have not reached the data target yet.
Could you please answer to my total question.
Will you be able to send some screen shots to my email id [email protected]
Thanks
Shri -
Problem in Flat File Uploading
Hi everyone
During a flat file upload, I get a message for each record of my SKU (Stock Keeping Unit). For example, for SKU 1017023 I get the message
<b>Value '1017023 ' for characteristic PISKU is in external format RRSV 9</b>
PISKU is the field name.
The whole extraction is failing.
Kindly provide me some suggestions as soon as possible!
If possible mail me at
[email protected]
Regards
Gajendra Singh

Hi,
It seems there is a trailing space in the value '1017023 '. You can either edit the record in the PSA and send it again, or delete the request, fix the flat file entry, and reload the data.
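A simple pre-load check can catch this kind of stray whitespace before the request fails; a hypothetical Python sketch over the key values:

```python
def clean_sku(raw: str) -> str:
    # Strip leading/trailing whitespace that makes a value invalid
    # in external format.
    return raw.strip()

values = ["1017023 ", "1017024", " 1017025"]
print([clean_sku(v) for v in values])  # ['1017023', '1017024', '1017025']
```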
Hope this helps
Assign points if useful
Regards,
venkat -
Date format problem in Flat file extraction
My flat file sends dates like DD/MM/YYYY HH:MM:SS. How can I extract the date, and if possible the time, into separate objects in my ODS?
How should I create my InfoObjects and map them?
Thanks
Babu

Hi,
Create a transfer routine and use the CONVERT TIME STAMP statement. Check the F1 help for details of the statement.
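The split itself is straightforward date handling; a routine-style sketch in Python using the format from the question (the YYYYMMDD/HHMMSS targets are the usual internal formats for date and time objects):

```python
from datetime import datetime

def split_flatfile_timestamp(value: str):
    # Parse 'DD/MM/YYYY HH:MM:SS' and return (date, time) as
    # YYYYMMDD and HHMMSS strings.
    ts = datetime.strptime(value, "%d/%m/%Y %H:%M:%S")
    return ts.strftime("%Y%m%d"), ts.strftime("%H%M%S")

print(split_flatfile_timestamp("01/10/1999 14:30:29"))  # ('19991001', '143029')
```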
Siggi -
Question about flat file schema definitions
Hello,
I have defined a flat file schema which works fine. However, I now have a new requirement for this schema: it has to support potential future additional fields at the end of the records.
The solution I used is quite "ugly": I added an additional filler field at the end of the record, configured it as minOccurs="0", and set "Allow early termination of optional fields" to true.
This works, but I don't like it.
It seems to me that there should be a property for ignoring any additional fields after the last one, so I wouldn't need the filler field.
Is anyone familiar with such an option/property?
Thanks all.

Hi David,
The minOccurs and allow_early_termination options are perfectly valid for marking the trailing fields as optional. There is no issue with this approach, and it's not ugly; I have seen it implemented in many projects without problems.
Refer to this MSDN blog on the topic; what you're doing is correct.
http://blogs.msdn.com/b/skaufman/archive/2004/05/07/127899.aspx
If this answers your question, please mark it accordingly. If this post is helpful, please vote it as helpful by clicking the upward arrow next to my reply. -
Getting Problem in Flat File to IDoc Scenario.
Hi,
I am facing a problem, as follows.
The test flat file contains data as follows:
Field1,Field2,Field3
Field4,Field5,Field6
Field1,Field2,Field3
Field4,Field5,Field6
Field1,Field2,Field3
Field4,Field5,Field6
The source structure has only one segment with all six fields, and it repeats.
What should the File Content Conversion parameters be so that it picks the first two lines and converts them into a segment this way? (I can't create two segments.)
Thanks in advance.

Hi,
I agree with Vijay, so try to create the data type like this:
Recordset 0...1
-----Line1 0...unbounded
--------Field1 0...1
--------Field2 0...1
--------Field3 0...1
-----Line2 0...unbounded
--------Field4 0...1
--------Field5 0...1
--------Field6 0...1
Now do your FCC
Recordset Structure ---- Line1,*,Line2,*
Recordset per Message ---- 1
Line1.fieldNames ----- Field1,Field2,Field3
Line1.fieldSeparator ---- ,
Line1.endSeparator ----- 'nl'
Line2.fieldNames ----- Field4,Field5,Field6
Line2.fieldSeparator ---- ,
Line2.endSeparator ----- 'nl'
now map the fields as per your requirement.
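The grouping those parameters describe can be sketched outside XI as well; a minimal hypothetical Python parser for the alternating two-line records (field and line names match the FCC config above):

```python
def parse_two_line_records(text: str):
    # Group alternating Line1/Line2 CSV rows into one record each,
    # mirroring Recordset Structure "Line1,*,Line2,*".
    lines = [ln for ln in text.splitlines() if ln.strip()]
    records = []
    for i in range(0, len(lines) - 1, 2):
        line1 = dict(zip(("Field1", "Field2", "Field3"), lines[i].split(",")))
        line2 = dict(zip(("Field4", "Field5", "Field6"), lines[i + 1].split(",")))
        records.append({"Line1": line1, "Line2": line2})
    return records

sample = "a,b,c\nd,e,f\ng,h,i\nj,k,l\n"
print(len(parse_two_line_records(sample)))  # 2
```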
Regards,
Sarvesh -
Problem in flat file loading(bulk data)
Hi,
I am populating a flat file with 300,000 (3 lac) records. I am getting an error due to a buffer space problem:
"Error during target file write "
Around 200,000 records get inserted before the error comes up.
As a temporary workaround I am populating 150,000 records at a time, in two runs.
Could anyone please tell me how to solve this issue?
Thanks in advance.

I am using OWB version 10.2.
In the map, the source is an Oracle table and the target is a flat file.
The source table contains 300,000 records.
I executed the map, and during execution it throws the following error:
"Error during target file write "
But it has inserted 200,000 records into the target file. -
Problem with flat-file to oracle
Hi all,
The flat file has 7 columns, all VARCHAR:
SALE_2007
C1 VARCHAR2(255),
C2 VARCHAR2(255),
C3 VARCHAR2(255),
C4 VARCHAR2(255),
C5 VARCHAR2(255),
C6 VARCHAR2(255),
C7 VARCHAR2(255)
The target also has 7 columns; the first four are VARCHAR and the last three are NUMERIC(20,2).
SALE
TIMECODE VARCHAR2(255),
CUSTCODE VARCHAR2(255),
SALECODE VARCHAR2(255),
PRODUCTCODE VARCHAR2(255),
AMOUNT NUMERIC(20,2),
COST NUMERIC(20,2),
SALEAMOUNT NUMERIC(20,2)
I mapped the transformations in the staging area with these expressions (in the same column order as above):
SALE_2007.C1
SALE_2007.C2
SALE_2007.C3
SALE_2007.C4
TO_NUMBER(REPLACE(SALE_2007.C5,',',''))
TO_NUMBER(REPLACE(SALE_2007.C6,',',''))
TO_NUMBER(REPLACE(SALE_2007.C7,',',''))
I have to use the REPLACE function because the text file contains commas in the data, e.g. 1,400.35,
and ODI errors in the "insert into flow table" step with:
1722 : 42000 : java.sql.SQLException: ORA-01722: invalid number
Please advise
Thank you all.

Hi,
It seems there are non-numeric characters present in your C5-C7 columns. Please check your source data for these columns.
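One quick way to find the offending rows before the load is to apply the same comma-stripping conversion and report values that still fail to parse; a sketch of that check (column names follow the staging table above):

```python
def find_invalid_numbers(rows, cols=("C5", "C6", "C7")):
    # Report every value that would still raise ORA-01722 after
    # stripping the thousands separators, as the mapping does.
    bad = []
    for i, row in enumerate(rows):
        for col in cols:
            try:
                float(row[col].replace(",", ""))
            except ValueError:
                bad.append((i, col, row[col]))
    return bad

rows = [{"C5": "1,400.35", "C6": "12.00", "C7": "N/A"}]
print(find_invalid_numbers(rows))  # [(0, 'C7', 'N/A')]
```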
Thanks,
Saravanan Rajavel -
Problem converting flat file to XML using JMS Adapter
I need to take an MQSeries message in a flat file format and convert it to XML before processing. I have configured the modules as shown in this screenshot:
http://www.radesix.com/JMSConfig.jpg
The message is received, but it isn't converted to XML. When I view the payload, I get the message shown in this screenshot:
http://www.radesix.com/JMSError.jpg
I am new to XI. Any ideas?

For a simple plain conversion, here is a config which works in our system
(left: parameter key, right: parameter value; the module key is always the same):
Transform.Class com.sap.aii.messaging.adapter.Conversion
TransformContentType text/xml;charset=utf-8
xml.conversionType SimplePlain2XML
xml.addHeaderLine 0
xml.processFieldNames fromConfiguration
xml.documentName SA02_Identnummer
xml.documentNamespace urn:mycompany-com:logistics:DFT:HWL
xml.structureTitle SA02_Identnummer_Satz
xml.fieldNames Satzart,Identnummer,Status
xml.fieldFixedLengths 2,10,3
Be aware that you must delete all spaces in the config, especially when you copy and paste values.
For structured conversion the entries are a little more complex.
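As a rough illustration of what SimplePlain2XML does with those fieldFixedLengths, here is a Python sketch of the fixed-length-to-XML conversion (element nesting follows the documentName/structureTitle values above; the SAP module's exact output may differ in detail):

```python
def plain_to_xml(line: str) -> str:
    # Slice fixed-width fields (xml.fieldFixedLengths 2,10,3) and wrap
    # them in the configured document/structure element names.
    fields = (("Satzart", 2), ("Identnummer", 10), ("Status", 3))
    parts, pos = [], 0
    for name, width in fields:
        value = line[pos:pos + width].strip()
        parts.append(f"<{name}>{value}</{name}>")
        pos += width
    return ("<SA02_Identnummer><SA02_Identnummer_Satz>"
            + "".join(parts)
            + "</SA02_Identnummer_Satz></SA02_Identnummer>")

print(plain_to_xml("011234567890ABC"))
```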
Regards
Stefan -
I need help! It's a problem about files!
I want to write a program that watches the files in a selected folder: if there is a new file in the folder, the program copies it to another folder, checking the folder every second.
Here is my problem: when a big file (say 120 MB) is being copied into the folder manually while the program is running, the program sees the file and tries to copy it to the other folder, but the file is not complete yet, so it throws java.io.FileNotFoundException. I don't want to just catch this exception; I want the program to know that the file cannot be transferred yet.
What can I do? I have already tried the canRead() and canWrite() methods, but they don't work. How can I tell that the file is not complete?

I'm not sure there is a way for a Java program to detect that another application is still writing a file.
Why do you say you don't want to catch the exception? You'll have to :) A simple solution is to try the file copy and, if it fails, keep trying again every second or so until it works. If the file hasn't grown in size between attempts, it isn't being written anymore, and something else is wrong.
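The "hasn't grown in size" heuristic from the reply can be sketched like this (in Python for brevity; the polling interval and number of checks are illustrative):

```python
import os
import time

def wait_until_stable(path, interval=1.0, checks=2):
    # Wait until the file size is unchanged for `checks` consecutive
    # polls: a heuristic that the other writer has finished.
    last, stable = -1, 0
    while stable < checks:
        size = os.path.getsize(path)
        if size == last:
            stable += 1
        else:
            stable = 0
        last = size
        time.sleep(interval)
    return last
```

Only attempt the copy once this returns; a file still being written keeps resetting the stability counter.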