RFBIBL00 Flat File Creation issue
Hello All,
I'm using RFBIBL00 to post FI documents; the input for the FB01 transaction screen comes from a flat file in the AL11 directory.
The issue concerns the format of the document date and posting date. The dates in the flat file are in YYYYMMDD format. If my user defaults (SAP Own data) specify a date format that does not start with YYYY, the document is not posted and an error is raised saying the date format is incorrect. If I choose a date format starting with YYYY, the document posts successfully.
Any suggestions on how to make the posting independent of the user's date format?
Thanks,
Bharath.
Hi Bharath
I think you'll have to use an appropriate conversion exit for this. Choose whichever of the options below suits you.
SAP Internal format is 'YYYYMMDD' Example: '19990123'
1. CONVERT_DATE_TO_EXTERNAL
2. CONVERT_DATE_TO_INTERNAL
3. CONVERSION_EXIT_PDATE_INPUT Conversion Exit for Domain GBDAT: DD/MM/YYYY -> YYYYMMDD
4. CONVERSION_EXIT_PDATE_OUTPUT Conversion Exit for Domain GBDAT: YYYYMMDD -> DD/MM/YYYY
5. CONVERSION_EXIT_IDATE_INPUT External date INPUT conversion exit
6. CONVERSION_EXIT_IDATE_OUTPUT External date OUTPUT conversion exit
7. CONVERSION_EXIT_LDATE_OUTPUT Internal date OUTPUT conversion exit (e.g. YYYYMMDD)
8. CONVERSION_EXIT_SDATE_INPUT External date (e.g. 01.JAN.1994) INPUT conversion exit
9. CONVERSION_EXIT_SDATE_OUTPUT Internal date OUTPUT conversion exit (e.g. YYYYMMDD)
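As an illustration of what those conversion exits accomplish, here is a language-neutral Python sketch of normalizing a handful of common external date formats to the SAP internal YYYYMMDD form. The format list is an assumption for illustration; in ABAP you would call one of the function modules above instead:

```python
from datetime import datetime

def to_sap_internal(date_str):
    """Try several common external formats and return SAP internal YYYYMMDD.

    The format list here is an assumption; extend it to match the user
    defaults actually in use in your system.
    """
    formats = ("%Y%m%d", "%d.%m.%Y", "%m/%d/%Y", "%d/%m/%Y", "%Y-%m-%d")
    for fmt in formats:
        try:
            return datetime.strptime(date_str, fmt).strftime("%Y%m%d")
        except ValueError:
            continue  # not this format, try the next one
    raise ValueError(f"Unrecognized date format: {date_str!r}")

print(to_sap_internal("23.01.1999"))  # 19990123
```

Note the order of the formats matters for ambiguous values such as 01/02/1999; put the convention your users actually follow first.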
The date coming in from the flat file should stick to one particular format, I'd guess.
Hope this helps.
Harsh
Similar Messages
-
FBS1 and RFBIBL00 Flat File Creation
Hello All,
I am creating a file for processing by RFBIBL00. I am using the data structures that the program expects when it consumes the file:
DATA: lv_session TYPE BGR00, " session record
      lv_hdr     TYPE BBKPF, " document header
      lv_rec     TYPE BBSEG. " detail (line item)
The file gets created and consumed fine by RFBIBL00. However, when I go to SM35 to process the session, and it goes through the FBS1 screen, the data formats are not correct.
The dates must be created in the format given by the structure, which is: 06232009.
When the session goes through validation, it complains that the format is not correct.
It wants the date in the format 06.23.2009, but I cannot put the '.' in, because the date variable is only 8 characters long.
Has anyone else had this issue?
Thanks
keith warnock wrote:
> When I pass the date of 20090623, the session gets created OK.
>
> When you run the session the date appears on screen as so: '20.09.0623'
>
> The error is: ' For the ML Fiscal year variant, no period is defined for 20.09.0623'
Hello Keith,
Just try passing the date in the format DDMMYYYY and check.
BR,
Suhas -
Hello All,
I am trying to do an extract using the flat file method in BI 7.0. My CSV file has about 150 fields, but I only need about 10 of them, and they are scattered around the file. When I use Read Preview Data in the Preview tab, I see incorrect data, and all of it lands under one field, even though in the Extraction tab I set the data separator to ; (and also tried checking the HEX box) and the escape sign to " (tried HEX for that as well). Header rows to ignore is set to 1.
One thing I would like to know is how the BI InfoObject will know the position of the flat file field in the CSV file and where it will be mapped. I know it can be mapped in the transformation, but that is from the flat file DataSource; I am asking about the CSV file itself.
Please help me out and tell me what I am doing incorrectly.
Thanks for your help. Points will be assigned.

Hi,
Use , and ; as the escape signs.
The system takes care of it when you specify the path and name of the file and the format as CSV.
The system always checks the one-to-one mapping of the flat file fields with the InfoObjects in the DataSource.
Options for you:
1. Arrange the necessary fields in the flat file so that they map exactly to the InfoObjects, then start loading.
2. Keep the file as it is, load the scattered fields, and in the transformation map only the required fields.
The second option consumes more memory unnecessarily.
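The delimiter and escape settings above behave like a standard CSV parse. The following Python sketch (column names and positions are hypothetical) shows how a semicolon-delimited file with a quote escape splits, and how a subset of scattered fields can be picked out by position:

```python
import csv

# Hypothetical sample: a "wide" file with 5 columns where only 2 are needed.
with open("wide.csv", "w", newline="", encoding="utf-8") as f:
    f.write("A;B;C;D;E\n")          # header row (rows to ignore = 1)
    f.write('a1;b1;"c;1";d1;e1\n')  # quoted field containing the separator
    f.write("a2;b2;c2;d2;e2\n")

wanted = [0, 3]  # positions of the needed fields in the wide file
rows = []
with open("wide.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f, delimiter=";", quotechar='"')
    next(reader)  # skip the header, like "header rows to ignore: 1"
    for rec in reader:
        rows.append([rec[i] for i in wanted])

print(rows)  # [['a1', 'd1'], ['a2', 'd2']]
```

If the preview shows everything in one field, the separator the parser is using does not match the one actually in the file; checking the raw bytes of the first line usually settles it.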
For BI 7.0, the basic steps to load data from flat files (Excel-exported files) are as follows; follow the step-by-step directions below.
Uploading of master data
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Field tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, as the server will map them automatically. If you do not map them in the Field tab, you have to map them manually during the transformation in the InfoProviders.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right clicking the DataSource, and in the Schedule tab click Start to load data to the PSA. (Make sure the flat file is closed during loading.)
3. Creation of data targets
In left panel select info provider
Select created info area and right click to select Insert Characteristics as info provider
Select required info object ( Ex : Employee ID)
Under that info object select attributes
Right click on attributes and select create transformation.
In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right click the data target and select Manage; in the Contents tab, select Contents to view the loaded data. Alternatively, the monitor icon can be used.
BW 7.0
Uploading of Transaction data
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
5. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
6. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( Transaction data )
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Field tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, as the server will map them automatically. If you do not map them in the Field tab, you have to map them manually during the transformation in the InfoProviders.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right clicking the DataSource, and in the Schedule tab click Start to load data to the PSA. (Make sure the flat file is closed during loading.)
7. Creation of data targets
In left panel select info provider
Select the created info area and right click to create an ODS (DataStore object) or cube.
Specify a name for the ODS or cube and click create.
From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
Click Activate.
Right click on ODS or Cube and select create transformation.
In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
Activate created transformation
Create a Data Transfer Process (DTP) by right clicking the ODS or cube.
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
8. Monitor
Right click the data target and select Manage; in the Contents tab, select Contents to view the loaded data. The ODS has two tables, a new table and an active table; to move data from the new table to the active table, you have to activate the request after selecting the loaded data. Alternatively, the monitor icon can be used.
Ramesh -
I am attempting to perform a fairly standard operation, extract a table to a flat file.
I have set all the schemas, models and interfaces, and the file is produced how I want it, apart from one thing. In the source, one field is 100 characters long, and in the output, it needs to be 25.
I have set the destination model to have a column physical and logical length of 25.
Looking at the documentation presented at http://docs.oracle.com/cd/E25054_01/integrate.1111/e12644/files.htm - this suggests that setting the file driver up to truncate fields should solve the issue.
However, building a new file driver using the string 'jdbc:snps:dbfile?TRUNC_FIXED_STRINGS=TRUE&TRUNC_DEL_STRINGS=TRUE' does not appear to truncate the output.
I noticed a discrepancy in the documentation - the page above notes 'Truncates strings to the field size for fixed files'. The help tooltip in ODI notes 'Truncates the strings from the fixed files to the field size'. Which might explain the observed lack of truncation.
My question is - what is the way to enforce field sizes in a flat file output?
I could truncate the fields separately in each of the mapping statements using substr, but that seems counter-intuitive, and losing the benefits of the tool.
Using ODI Version:
Standalone Edition Version 11.1.1 - Build ODI_11.1.1.5.0_GENERIC_110422.1001
If this is an elementary issue, please let me know what I've missed in the manual. -
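The substr workaround mentioned above can at least be centralized instead of repeated in every mapping. A Python sketch (outside ODI; the field widths are hypothetical) of truncating every output field to its target size in one place:

```python
def truncate_fields(record, widths):
    """Truncate each output field to its target width.

    This is the substr fallback mentioned above, applied once per record
    instead of once per mapping expression.
    """
    return [value[:w] for value, w in zip(record, widths)]

# Hypothetical record: a 100-char source field that must fit into 25.
print(truncate_fields(["x" * 100, "short"], [25, 10]))
```

Within ODI itself, the equivalent would be wrapping each mapping in SUBSTR(..., 1, width), so this sketch only shows the logic, not the tool-native fix.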
Hi,
A newbie to SSIS, but I was able to create an SSIS package which extracts XML data from one of the SQL Server columns using an Execute SQL Task and passes it to a Foreach Loop container, which contains an XML task that applies a transform to each input XML and appends it to a flat file (additionally using a Script Task within the Foreach container).
All good so far, but now I want to apply conditional splitting to the Execute SQL Task above and additionally pipe it to an exactly similar process (a Foreach container doing the XML task as described above) for a different kind of flat file.
If I alter the design to use the data flow approach and apply the out-of-the-box Conditional Split, I run into not knowing how to connect and execute more than one Foreach container with the embedded XML and Script tasks (data flow to control flow connection).
It is easy to put everything in a Sequence container and repeat the Execute SQL Task, but to me that is very inefficient.
Any creative ideas or pointers to some internet content which tells me how can this be done most efficiently.
Hope my question makes sense and let me know if you need more clarification.
Thanks in advance.
SM

As I understand, what you're asking for is a way to create conditional branches that do a different type of processing for each subset of data. For this you can do as below:
1. Add a set of tasks for the additional processing of each type of flat file and link them to the Execute SQL Task.
2. Set the precedence constraint evaluation to Expression and Constraint for each of them. Make the constraint OnSuccess and the expression based on your condition. You may need to create SSIS variables to capture the field values used in the expression.
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Hi,
We need to read a flat file and transform it to the destination XML format, then send it to the destination file location.
Steps we have done:
1. Create a JCA adapter and configure the flat file schema
2. Create a proxy based on the JCA
3. Create a transformation from the source to the target schema (this has no namespace)
4. Create a business service for sending the output
Everything works as expected when testing from the OSB test console. But when the file is placed in the source folder, the output XML has the namespace xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" on the root node.
e.g,
<Root xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" >
<Child>
<Child1/>
<Child2/>
<Child3/>
</Child>
</Root>
But expected output is
<Root>
<Child>
<Child1/>
<Child2/>
<Child3/>
</Child>
</Root>
We tried converting the XML to a string and then replacing the xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" value with blank.
We also tried a hardcoded XML using an Assign action instead of the transformation; even the hardcoded XML has the namespace on the root node.
But the end system is failing due to this namespace value.
Help me to resolve the issue.
Thanks,
Vinoth
Edited by: Vinoth on Jun 8, 2011 10:12 PM

Ideally your end system should not fail if you specify any number of namespace declarations in the XML, unless they are used in elements or attributes within the XML. They should try to resolve this on their side.
But to see what's going on in OSB, can you please log the $body variable before the Publish action and paste the content here? Or send the sbconfig of the proxy and business service to me at the email mentioned in my profile, if possible. -
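For reference, the string-replacement workaround the poster tried is workable when the namespace prefix is genuinely unused anywhere in the document. A Python sketch (outside OSB, purely as an illustration of the text manipulation) of stripping an unused namespace declaration from serialized XML:

```python
import re

def strip_unused_ns(xml_text, ns_uri):
    """Remove an xmlns:prefix="uri" declaration from serialized XML.

    This is only safe when no element or attribute in the document
    actually uses the prefix; otherwise the XML becomes invalid.
    """
    pattern = r'\s+xmlns:[A-Za-z0-9._-]+="%s"' % re.escape(ns_uri)
    return re.sub(pattern, "", xml_text)

doc = '<Root xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/"><Child/></Root>'
print(strip_unused_ns(doc, "http://schemas.xmlsoap.org/soap/envelope/"))
# <Root><Child/></Root>
```

In OSB the same effect is usually achieved in the message flow (e.g. replacing the body content with a rebuilt element), which avoids treating XML as a string at all.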
Hello
I'm trying to load a master data infoobject with a flat file and getting errors.
The infoobject ZCOLLECT_DATE has two fields - ZQUARTER and ZCOLLECTDATE
ZCOLLECT_DATE is of Data type NUMC, Length 5, conversion routine PERI5 and Output length 6
I created a flat file in Excel the following way:
Left the first row blank
Second column - 1/2008 (for ZQUARTER)
Third column - 01/11/2008 (for ZCOLLECTDATE)
I saved it as a CSV (save as-> csv).
In the infopackage, I've selected this file in the External data tab, File type CSV, Data separator ; , Escape sign "
When I try to load the infoobject, I get the following error:
"Error in conversion exit CONVERSION_EXIT_PERI5_INPUT"
I thought this was because the length in my InfoObject is 5, whereas ZQUARTER is of length 6 in my flat file (e.g. 1/2008) and ZCOLLECTDATE is in the format mm/dd/yyyy.
Any thoughts on what the issue is?
Thanks
Alex

Hi Hemant,
I changed it to 12008, 22008, etc., but I still get the error "Error in conversion exit CONVERSION_EXIT_PERI5_INPUT".
Also, when I save the xls file as CSV, I get two pop-ups saying some features are not compatible and only those features that are compatible will be saved. Are these normal warnings, or could they be causing the issue?
Also, in the infoobject, since it only consists of dates, do I need to specify conversion routine PERI5 in Infoobject properties/maintenance?
Thanks -
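Hemant's suggestion amounts to removing the separator from the quarter value before the load. A Python sketch of that cleanup; whether PERI5 expects period-then-year or year-then-period here is an assumption to verify against CONVERSION_EXIT_PERI5_INPUT in your system:

```python
def quarter_to_numc5(value):
    """Turn '1/2008' into the 5-character numeric string '12008'.

    Assumption: period comes first, then the 4-digit year. Check the
    expected ordering against CONVERSION_EXIT_PERI5_INPUT before relying
    on this.
    """
    period, year = value.strip().split("/")
    return period + year

print(quarter_to_numc5("1/2008"))  # 12008
```

Running such a cleanup over the CSV before the load removes the separator that the NUMC 5 field cannot hold.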
Flat File Load Issue - Cannot convert character sets for one or more characters
We have recently upgraded our production BW system to SPS 17 (BW SP19), and we have issues loading flat files (containing Chinese, Japanese and Korean characters) which worked fine before the upgrade. The Asian-language characters appear as invalid (garbled), as we can see in the PSA. The character set setting was previously left at the default and worked fine until the upgrade. We referred to note 1130965, and with code page 1100 the load went through (without the "Cannot convert character sets for one or more characters" error), but the Asian-language characters still appear as invalid. We tried all the code pages suggested in the note, e.g. 4102, 4103, on the InfoPackages, but it did not work. Note that the flat files are encoded in UTF-8. I checked the lowercase option for all InfoObjects.
When I checked the PSA failure log, the number of records processed is "0 of 0". My question is: without processing a single record, why is the system throwing this error message?
When I load the same file from the local workstation, there is no error message.
I am thinking that when I FTP the file from the AS/400 to the BI application server (Linux), some invalid characters are added. But how can we track those invalid characters?
Gurus, please share your thoughts on this. I will assign full points.
Thanks, -
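One way to track invalid characters introduced in transfer is to scan the file for bytes that fail UTF-8 decoding. A Python sketch; running it against the copy on the application server and the copy on the workstation and comparing the results would show whether the FTP step corrupted anything:

```python
def find_bad_utf8(data):
    """Return the byte offsets where the data stops being valid UTF-8."""
    offsets = []
    pos = 0
    while pos < len(data):
        try:
            data[pos:].decode("utf-8")
            break  # the remainder decodes cleanly
        except UnicodeDecodeError as err:
            offsets.append(pos + err.start)  # absolute offset of bad byte
            pos += err.start + 1             # skip past it and keep scanning
    return offsets

# A file transferred in the wrong FTP mode often picks up stray bytes:
sample = "日本語".encode("utf-8") + b"\xff" + "한국어".encode("utf-8")
print(find_bad_utf8(sample))  # [9]
```

Transferring the file in binary mode (not ASCII mode, which may apply an EBCDIC-to-ASCII translation on an AS/400) is the usual first thing to rule out.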
Hi All,
I am facing a problem while doing a data load through a flat file; it's a .csv file.
The problem arises when I have records like £1,910.00; the comma creates a separate column.
I have lots of such amount fields.
Is there any breakthrough for this issue? Your answer would be highly appreciated.
Regards,
Kironmoy Banerjee

Hi,
As Satish has mentioned, it is better to maintain two separate columns, as below:
1. Currency: GBP
2. Amount: 1910
(GBP indicates British Pound.)
When you maintain the amount as 1,910.00, it spills into the next column, because in the InfoPackage Extraction tab the data separator is , (comma).
Regards
Surendra
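Surendra's advice can be sketched as follows: either keep the currency and a comma-free amount in separate columns, or let a CSV writer quote the formatted amount so the embedded comma is not read as a separator. A Python illustration (not BW itself; the loader must also be configured with " as the escape sign for the quoted variant to work):

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
# Option 1 (recommended above): currency and plain amount in separate columns.
writer.writerow(["GBP", "1910.00"])
# Option 2: keep the formatted amount, but the writer quotes it so the
# embedded comma is not taken as a column separator.
writer.writerow(["amount", "£1,910.00"])
print(buf.getvalue())
```

With QUOTE_MINIMAL, only fields containing the delimiter (or quote character) get quoted, so ordinary columns stay unchanged.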
Edited by: vaade surendra on May 7, 2010 11:37 AM -
Oracle API for Extended Analytics Flat File extract issue
I have modified the Extended Analytics SDK application as such:
HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_FLATFILE
instead of
HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_STANDARD
I am trying to figure out where the Flat file extract is placed once the Extract is complete. I have verified that I am connecting to HFM and the log file indicates all the steps in the application have completed successfully.
Where does the FLATFILE get saved/output when using the API?
Ultimate goal is to create a flat file through an automated process which can be picked up by a third party application.
thanks for your help,
Chris

Never mind. I found the location on the server.
-
Urgent: Flat file load issue
Hi Guru's,
We are loading into a data target (ODS) via flat file. The problem is that the flat file shows the date field values correctly with 8 characters, but when we do a preview it shows 7 characters, and the load does not go through.
Does anyone know where the problem is, i.e. why the preview screen shows 7 characters of the date when the flat file has 8?
Thanks
MK

Hi Bhanu,
How do I check whether a conversion is specified? Another thing: it is not just one field. We have 6 date fields, and all of them show 7 characters in the PSA after loading, whereas the flat file has 8 characters in all 6 date fields.
In PSA I checked the error message it shows 2 error messages:
First Error message:
The value '1# ' from field /BIC/ZACTDAYS is not convertible into the DDIC data type QUAN of the InfoObject in data record 7. The field content could not be transferred into the communication structure format.
System Response
The data to be loaded has a data error or field /BIC/ZACTDAYS in the transfer structure is not mapped to a suitable InfoObject. The conversion of the transfer structure to the communication structure was terminated. The processing of data records with errors was continued in accordance with the settings in error handling for the InfoPackage (tab page: Update Parameters).
Check the data conformity of your data source in the source system. On the 'Assignment IOBJ - Field' tab page in the transfer rules, check the InfoObject-field assignment for transfer-structure field /BIC/ZACTDAYS.
If the data was temporarily saved in the PSA, you can also check the received data for consistency in the PSA. If the data is available and is correct in the source system and in the PSA, you can activate debugging in the transfer rules by using update simulation on the Detail tab page in the monitor. This enables you to perform an error analysis for the transfer rules. You need experience of ABAP to do this.
2nd Error Message:
Diagnosis
The transfer program attempted to execute an invalid arithmetic operation.
System Response
Processing was terminated, and indicated as incorrect. You can carry out the update again any time you choose.
Procedure
1. Check that the data is accurate. In particular, check that the data types and lengths of the transferred data agree with the definitions of the InfoSource, and that the fixed values and routines defined in the transfer rules are correct.
- The error may have arisen because you didn't specify the existing headers.
2. If necessary, correct the data error and start processing again.
Thanks
MK -
Hi All,
I have to load a flat file of type Text (Tab delimited) (*.txt) into BI. How can I do that?
It is coming from an external system, so I cannot have the format changed. What should I do in my flat file DataSource to be able to upload it?
Thanks
Rashmi.

Hi Rashmi,
There are two formats for flat file uploading:
One is ASCII, which means fixed-length format; there is no need for a delimiter, since according to the fixed lengths each value goes into the corresponding field.
The other is CSV (comma-separated values), in which the fields are separated by a comma.
As you say the input file is tab delimited, you need to change the file format into one of the above types before you load it into BI.
If you are using a process chain, you have to convert the file format manually and then place the file on the application server.
If you are loading manually, you can edit the file into this format and load it from your workstation.
Regards
vamsi
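The manual conversion vamsi describes can be scripted before the file is placed on the application server. A Python sketch (the file names are hypothetical) that rewrites a tab-delimited file as CSV:

```python
import csv

def tsv_to_csv(src, dst):
    """Rewrite a tab-delimited file as comma-separated for the BI load."""
    with open(src, newline="", encoding="utf-8") as fin, \
         open(dst, "w", newline="", encoding="utf-8") as fout:
        csv.writer(fout).writerows(csv.reader(fin, delimiter="\t"))

# Hypothetical sample file:
with open("in.txt", "w", encoding="utf-8") as f:
    f.write("1000\tWidget\t42\n")
tsv_to_csv("in.txt", "out.csv")
print(open("out.csv", encoding="utf-8").read())  # 1000,Widget,42
```

Using the csv module rather than a plain str.replace means fields that themselves contain commas get quoted correctly in the output.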
Edited by: vamsi talluri on Feb 6, 2009 5:36 AM -
Hi all,
I'm doing an IDoc-to-flat-file scenario. After the mapping and FCC, I have output like this:
SOLD_TO,SHIP_TO,BILL_TO,0000300623,0000300622,0000500569
But I want it in this way:
SOLD_TO,0000300623,SHIP_TO,0000300622,BILL_TO,0000500569
What should I do in FCC to get the desired output?
Thanks,
Srinivas

Suman,
MT
 DT
  ADDR_TYPE
  EXT_ADDR_ID
These two fields I have duplicated twice each in the mapping:
MT
 DT
  ADDR_TYPE = SOLD_TO
  ADDR_TYPE = SHIP_TO
  ADDR_TYPE = BILL_TO
  EXT_ADDR_ID = 0000300623
  EXT_ADDR_ID = 0000300622
  EXT_ADDR_ID = 0000500569
Srini -
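Outside of FCC, the desired ordering is just an interleaving of the ADDR_TYPE and EXT_ADDR_ID value lists. A Python sketch of the target pairing, using the values from the post; in PI/XI itself the fix is usually made in the message structure (one record node per type/id pair) rather than post-processing the text:

```python
types = ["SOLD_TO", "SHIP_TO", "BILL_TO"]
ids = ["0000300623", "0000300622", "0000500569"]
# Pair each ADDR_TYPE with its EXT_ADDR_ID instead of emitting all the
# types first and all the ids afterwards.
line = ",".join(value for pair in zip(types, ids) for value in pair)
print(line)  # SOLD_TO,0000300623,SHIP_TO,0000300622,BILL_TO,0000500569
```

This makes the structural point: the flat file content follows the record structure, so the pairing has to exist in the structure before FCC serializes it.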
Flat File Reconciliation Issue
Hi All,
I am trying to achieve flat file reconciliation. For that, I created a GTC connector with the following configuration:
Staging Directory (Parent identity data): /home/GTC
Archiving Directory: /home/GTC/archive
File Prefix: PLC
Specified Delimiter: |
File Encoding UTF8
Source Date Format yyyy/MM/dd hh:mm:ss z
Reconcile Deletion of Multivalued Attribute Data check box [cleared]
Reconciliation Type Full
In the Configuration Mapping, I created a Status variable and mapped it to OIM Object Status, and changed the data type of Last Logon Timestamp to Date.
The Layout of flat file PLC.csv.txt is like this:
Account Name|Full Name|Domain|Last Logon Timestamp|Description|GUID|Mail|Employee ID|First Name|Last Name
PLC\!alders|Steve Alder|PLC.COM|2007/01/10 11:16:27|Directory and Messaging Services|E109B9F8-40BD-4E72-B336-B46600A5B38E|||Steve|Alder
PLC\!lewisj|Jonathan Lewis|PLC.COM||Data Centre Scheduled Activities Team|1D580887-EDEB-4C87-A079-837AFBAA782F|||Jonathan|Lewis
The reconciliation is working fine; however, only the record without the date is visible in the Reconciliation Manager. The other record (with the date) is not displayed in the Reconciliation Manager.
I can't see any error in the logs either. I'm not sure, but this might be due to changing the data type of Last Logon Timestamp from String to Date while configuring the reconciliation mapping. However, I need Last Logon Timestamp to be a Date.
Please help.
Cheers,
Sunny

Hi,
OK, I will try removing the 'z'. These are the new logs:
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],inside getProvider......2
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],validationProviderclassname:com.thortech.xl.gc.impl.validation.IsFloatValidatorProvider, name:IsFloat
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],nameIsFloat
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],inside getProvider......2
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],validationProviderclassname:com.thortech.xl.gc.impl.validation.IsDoubleValidatorProvider, name:IsDouble
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],nameIsDouble
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],inside getProvider......2
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],validationProviderclassname:com.thortech.xl.gc.impl.validation.IsInRangeValidatorProvider, name:IsInRange
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],nameIsInRange
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],inside getProvider......2
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],validationProviderclassname:com.thortech.xl.gc.impl.validation.MatchRegexpValidatorProvider, name:MatchRegularExpression
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],nameMatchRegularExpression
DEBUG,24 Aug 2009 19:49:38,211,[XELLERATE.GC.PROVIDERREGISTRATION],inside getProvider......2
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION],validationProviderclassname:com.thortech.xl.gc.impl.validation.ValidateDateFormat, name:ValidateDateFormat
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION],nameValidateDateFormat
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION],come in validationName.equalsIgnoreCase(name)
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION],inside getProviderClassName. ..found transformation provider.....
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION],inside getProviderClassName....provider class name = ..com.thortech.xl.gc.impl.validation.ValidateDateFormat
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION], provider nameValidateDateFormat
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION], provider def attribnull
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION], provider resp codes{}
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION],inside getProviderClassName. ..found transformation provider.....com.thortech.xl.gc.impl.validation.ValidateDateFormat
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.ADAPTERS],Class/Method: tcADPClassLoader/getClassLoader entered.
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.ADAPTERS],Class/Method: tcADPClassLoader/getClassLoader left.
DEBUG,24 Aug 2009 19:49:38,212,[XELLERATE.GC.PROVIDERREGISTRATION],Loading Provider Class -->com.thortech.xl.gc.impl.validation.ValidateDateFormat
WARN,24 Aug 2009 19:32:54,225,[XELLERATE.GC.FRAMEWORKRECONCILIATION],Record failed on Validation and therefore would not be Reconciled
WARN,24 Aug 2009 19:32:54,225,[XELLERATE.GC.FRAMEWORKRECONCILIATION],Parent Data -->{Description=2029/03/07 09:01:11, Account Name=HRL\PUNEET, First Name=, Last Logon Timestamp=HRL1.CORP, Domain=Les, GUID=Hrl, Employee [email protected], Full Name=PUNEETKING, Mail=C7F1A30E-A0C7-444E-BA09-EC66E63831EB, Last Name=PUNEET}
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.SCHEDULER.TASK],Class/Method: SchedulerBaseTask/isStopped entered.
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.SCHEDULER.TASK],Class/Method: SchedulerBaseTask/isStopped left.
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Class/Method: SharedDriveReconTransportProvider/getNextPage entered.
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Class/Method: SharedDriveReconTransportProvider/getNextPage - Data: page size--> - Value: -1
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],pagesize-->-1
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Class/Method: SharedDriveReconTransportProvider/end entered.
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Class/Medthod: SharedDriveReconTransportProvider/end - Before calling: copyFilesToArchive(for Parent files)
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Class/Method: SharedDriveReconTransportProvider/copyFilesToArchive entered.
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Class/Method: SharedDriveReconTransportProvider/copyFilesToArchive - Data: src File--> - Value: LINUXRECON.COM.txt~
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],src File-->LINUXRECON.COM.txt~
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Class/Method: SharedDriveReconTransportProvider/copyFilesToArchive - Data: stage Dir--> - Value: /home/GTC
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],stageDir-->/home/GTC
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Class/Method: SharedDriveReconTransportProvider/copyFilesToArchive - Data: archive Dir--> - Value: /home/GTC/archive
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],archiveDir-->/home/GTC/archive
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],fis closed for /home/GTC/LINUXRECON.COM.txt~
DEBUG,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],fos closed for /home/GTC/archive/LINUXRECON.COM.txt~
INFO,24 Aug 2009 19:32:54,225,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],Info Data: file deleted --> - Value: true
WARN,24 Aug 2009 19:32:54,226,[XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT],FILE SUCCESSFULLY ARCHIVED : /home/GTC/LINUXRECON.COM.txt~
Cheers,
Sunny
Edited by: sunny@newbie on Aug 24, 2009 7:58 PM -
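The "Record failed on Validation" warning in the log is consistent with the timestamp format not matching the data (note the Parent Data fields also appear shifted relative to the header layout). A Python sketch of the same kind of pre-recon check, with the zone marker dropped as suggested and 24-hour %H assumed in place of Java's hh:

```python
from datetime import datetime

def valid_timestamp(value, fmt="%Y/%m/%d %H:%M:%S"):
    """Validate a recon field against the source date format.

    Assumptions: the trailing 'z' (zone) is dropped from the format, and
    24-hour %H stands in for Java's 12-hour hh.
    """
    if value == "":
        return True  # records without a last-logon value pass through
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

print(valid_timestamp("2007/01/10 11:16:27"))  # True
print(valid_timestamp("HRL1.CORP"))            # False
```

Running every Last Logon Timestamp value through such a check before reconciliation would surface both bad formats and column-shift problems like the one visible in the log.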
Flat file DataSource transport issue
Hi colleagues
I have created a DataSource of type flat file. When I transport this DataSource to the quality system, the DataSource is not in active status but in modified status.
Is there any particular action required to be able to transport the DataSource?
Thanks

Please check whether the DataSource was active in the development box when you transported the request. If not, activate it, then create a new request and transport it.