Import data with nulls in Desktop Intelligence
Hello,
I have data with null values (in a measure). If I export the data as CSV, it shows the null values as #EMPTY. If I then use that CSV as a data source, it loads #EMPTY as 0 (zero).
Is there a way to make Desktop Intelligence recognize null values? I tried removing the #EMPTY and leaving the column empty; it still loads as a numeric zero.
The same (actually worse) happens with XML. It seems that if I export my data to XML, Desktop Intelligence cannot use that as a data source. Odd. So I created the following XML file:
<?xml version='1.0' encoding='UTF-8'?>
<data>
<item key="AAAAAEUK" value="130" category="A" />
<item key="AAAACGDC" value="66" category="A" />
<item key="AAAADNJP" value="56" category="A" />
<item key="AAAAFWRQ" value="222" category="A" />
<item key="AAAAGITU" category="B" />
</data>
I would expect the value field to be null for the last row. But the value turns out to be 222! Whatever else I put there, an empty string "", "#EMPTY", "null", it is always returned as 0.
This is a problem because 0 and null are different when computing averages.
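A quick Python sketch (illustrative only, not DeskI itself) of why the distinction matters, using the values from the XML sample above:

```python
# The last item in the XML sample has no value.
values = [130, 66, 56, 222, None]

# Treating the missing value as 0 drags the average down:
avg_with_zero = sum(v or 0 for v in values) / len(values)

# Treating it as null (excluded from the count) gives the intended average:
present = [v for v in values if v is not None]
avg_with_null = sum(present) / len(present)

print(avg_with_zero)   # 94.8
print(avg_with_null)   # 118.5
```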
Hi Florian,
What is the source of your csv data?
If I have a table in Excel with empty cells, the data goes into the .csv as nothing, so your example would look like:
Item key,value,category
AAAAAEUK,130,A
AAAACGDC,66,A
AAAADNJP,56,A
AAAAFWRQ,222,A
AAAAGITU,,B
This reads in just fine in DeskI (shows #EMPTY in the view data, treats it as such in the report).
So I would look at getting the data generated the right way; then DeskI can handle it, no problem.
If all else fails, just run a search & replace and change ,#EMPTY, to ,,
BTW, if you have an alphanumeric value in a column where DeskI has already decided there should be numbers,
it will read in as 0; that is why the #EMPTY string value got transformed to zero.
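A minimal Python sketch of that search-and-replace workaround, assuming "#EMPTY" never appears as a legitimate data value:

```python
# Turn ",#EMPTY," cells into genuinely empty cells so DeskI reads
# them as null instead of as a string value coerced to 0.
rows = [
    "Item key,value,category",
    "AAAAFWRQ,222,A",
    "AAAAGITU,#EMPTY,B",
]
# First replace handles #EMPTY mid-line, second handles it at line end.
cleaned = [r.replace(",#EMPTY,", ",,").replace(",#EMPTY", ",") for r in rows]
print(cleaned[2])  # AAAAGITU,,B
```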
Good luck,
Marianne
Similar Messages
-
How to compare date with null value
Hi
I have a date field, and I assigned that input box a context attribute of type Date.
My problem is this: when the end user does not enter anything (null), I have to supply a default date.
So first, in the action method, I have to check the date for null;
if it is null, I assign the default date.
Please let me know how to do this.
thanks
Mukesh
Hi
You can get your date in your action method like
Date newDate=new Date();
Date myDate= wdThis
.wdGetYourComponentNameController()
.wdGetContext()
.currentYourNodeNameElement()
.getYourDateName();
if (myDate == null) {
wdContext.currentContextElement().setYourDateName(newDate);
} else {
// ... continue your other validations or call other methods
}
Regards
Abhijith YS
Message was edited by:
Abhijith YS -
Data with "," overflows to the next field when importing data with DTW
Dear all,
When I tried to import data with DTW, for data that includes a ",", everything after the "," jumps into the next field after the import.
E.g. Column A(memo): bank, United state
Column B(project code): blank
Result in SBO: Memo field : bank
Project code: United state
I have tried 2 ways.
1. Save the file as CSV, adding a double quote (") at the beginning and end of the data. However, it didn't work. It didn't work with single quotes either.
2. Save the file as TXT, without amending the data at all.
That worked; however, in SBO a double quote is auto-added, so it becomes "bank, United state".
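Whether DTW's parser accepts quoted CSV depends on the DTW version, but for reference, this Python sketch shows standard CSV quoting, under which an embedded comma survives a write/read round trip instead of spilling into the next column:

```python
import csv, io

# Writer quotes any field containing the delimiter.
buf = io.StringIO()
csv.writer(buf).writerow(["bank, United state", ""])
line = buf.getvalue().strip()
print(line)  # "bank, United state",

# Reader gives the original two fields back.
row = next(csv.reader(io.StringIO(line)))
print(row)  # ['bank, United state', '']
```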
Please kindly advise how to handle such case.
Thank you.
Regards,
Julie
Hi,
Check this if it is useful :
Upload of BP Data using DTW
Rgds,
Jitin -
Reinstalled and Imported Data with Migration Assistant, What went wrong?
My MacBook HD recently went boom! After replacing it, I reinstalled the OS from the original DVD discs. A few steps later, I was offered the option to import data from an older computer, so I plugged in my FireWire portable disk with a CarbonCopy image of my old disk and proceeded with the import process.
After the Assistant finished, only a minor import error with Missing Sync was displayed. When the OS was completely installed, I tried opening some Office apps and Dreamweaver, but none of them would open. So I tried to reinstall them, but then I discovered that after the disk image from the DMG files was mounted, I was unable to open its associated Finder window in order to copy or install the apps.
Does anyone know why this is?
Also, on both the FireWire disk and the MacBook fresh install, there are a few files and folders named Desktop-#. Why are there so many?
The FireWire disk is a backup of the original HD that went bust in my MacBook. I don't know the exact OS version for sure (maybe 10.4.10), but it was Tiger for sure. My MacBook is a Core Duo 13.3-inch model.
The HD inside my MacBook now has the original Tiger version that shipped with it, but I'm in the process of downloading and installing all issued updates. I re-erased all the contents again, so now my laptop is fresh again.
Any guiding light as to how to transfer the info on the FireWire disk back to the laptop?
BTW, I can boot my machine from the FireWire disk and it works without problems, except for the Office apps that don't launch. -
How to export/import data with different nls_lang???
hello
I now run my forms through Application Server to view them as web forms, and that succeeds, but the data appears unreadable because it is Arabic and I use the NLS charset WE8ISO8859P1. So I changed the NLS to an Arabic one, AR8MSWIN1256, but that failed. I then created a new database with charset UTF8 and NLS_LANG AR8MSWIN1256 and tried inserting new data; when I query the data, the new data appears readable but the old data is still unreadable. What can I do?
I think that if I export the data and import it with the new charset and NLS_LANG it will succeed... but how can I do it?
thank u
regards
Hi, did you already export data from SQL Server? Note that you cannot export data using SQL*Loader; SQL*Loader is only meant for importing data from flat files (usually text files) into Oracle tables.
To import data into Oracle tables using SQL*Loader, use the steps below:
1) Create a SQL*Loader control file.
It looks as follows:
LOAD DATA
INFILE 'sample.dat'
BADFILE 'sample.bad'
DISCARDFILE 'sample.dsc'
APPEND
INTO TABLE emp
TRAILING NULLCOLS
or search Google for a sample control file script.
2) At the command prompt, issue the following:
$ sqlldr test/test
Enter the control file = <give the control file name which you created earlier>
Debug any errors (if they occurred).
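For the null-handling theme of this thread, a slightly fuller control file might look like the sketch below (the table and column names are hypothetical). TRAILING NULLCOLS makes missing trailing fields load as NULL instead of raising an error:

```
LOAD DATA
INFILE 'sample.dat'
BADFILE 'sample.bad'
DISCARDFILE 'sample.dsc'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(empno, ename, sal)
```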
Error while importing data with DTW
Hi Guys,
I received an error when I tried to upload data with DTW:
method '~' of object '~' failed 65171
When I clicked run simulation again, I received another error, 65172.
These are two different errors.
Can anyone help?
Hi,
This error means that your Obscommon user is corrupt and you cannot connect to the SBO-Common database to download the OBServer.dll that is stored in the temporary directory.
For more information, see SAP Note 858475, 642564, or 614946.
Thanks & Regards,
Nagarajan -
Importing data with characterstics/Merchandise hierarchy in SAP IS Retail
Hi All,
I'm importing data into SAP IS Retail from MDM using the ARTMAS04 message type via SAP XI. I have certain merchandise hierarchies and characteristics maintained in ECC as well as in MDM. The article category is "Generic Article". I'm able to pass the characteristics and their values to ECC, but the IDoc is not getting posted and gives the error message "No characteristics could be found for generic articles".
In normal article creation in ECC using MM41, we have to create a variant for generic articles by passing characteristics and their values. How do I create a variant when I'm fetching data from MDM in XML format? Is there some setting required on the ECC side?
Please let me know if anyone has any clue.
Thanks
manish
not resolved
-
1.5.1: Excel import - Problem with NULL values in date fields
Hi,
I'm trying to import Excel data via CSV with the import-into-table feature of SQL Developer.
Everything works fine, but if I select some of my date columns I get an error telling me that I have null or invalid values in a date column:
Verifying if the Date columns have date formats
FAILED
Date columns SOA_CONTROLLING_NO, SOA_DATE_OF_CONTROLLING, PAYOUT_ME_2_SUM, PAYOUT_ME_1_SUM, PAYOUT_PLUS_SUM, PAYOUT_6_SUM, PAYOUT_5_SUM, PAYOUT_4_SUM, PAYOUT_3_SUM, PAYOUT_2_SUM, PAYOUT_1_SUM, CONTRACT_APPENDIX_SUM, CONTRACT_BASE_SUM, ME_CONVENANT_SUM, REF_SUM_NO_REPAY, SEL_CONVENANT_SUM, ME_REQ_SUM, REF_REQ_SUM_NO_REPAY, SEL_REQ_SUM, have invalid or null date formats
I tried:
20012121313,demo1,,
20012121313,demo1,NULL,
20012121313,demo1,to_date(NULL),
but none of them worked...
How may I import those columns, where some of the rows contain NULL values?
Same story using 1.5... ;-(
Thank you!
Best regards,
Johann
Look at the query Andy proposed:
SELECT NULL LINK, start_date, NVL(sum_of_trans, 0) + 0.001 VALUE
FROM (SELECT TRUNC (startdate) start_date,
CASE
WHEN :p9_view_by = 1
THEN ROUND (DURATION / 60, 3)
ELSE 1
END sum_of_trans
FROM TRANSACTION
WHERE trans_id = x AND startdate BETWEEN y AND z
GROUP BY TRUNC (startdate)) x
Does this query deliver anything?
Denes Kubicek -
Compare two dates with NULL in one
I am trying to do the following with simple SQL: compare two dates and get the latest one. But the greatest() function always returns NULL if a NULL is present. So how can I do it?
Date1 Date2 Desired Result
Null 01-Dec-09 01-Dec-09
01-May-09 01-Mar-09 01-May-09
01-May-09 NULL 01-May-09
01-May-09 01-Nov-09 01-Nov-09
NULL NULL NULL
Any suggestion? Thanks
Hi,
Try this:
create table test1 (
fdate date,
tdate date
);
insert into test1 values (null, '25-jan-2010');
select greatest(nvl(fdate,tdate), nvl(tdate,fdate)) greatest from test1;
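The NVL trick substitutes the other column when one side is NULL, so NULL only wins when both sides are NULL. A Python sketch of the same logic, for illustration:

```python
from datetime import date

def greatest(d1, d2):
    a = d1 if d1 is not None else d2   # nvl(fdate, tdate)
    b = d2 if d2 is not None else d1   # nvl(tdate, fdate)
    if a is None and b is None:        # both inputs were None
        return None
    return max(a, b)

print(greatest(None, date(2009, 12, 1)))             # 2009-12-01
print(greatest(date(2009, 5, 1), date(2009, 3, 1)))  # 2009-05-01
print(greatest(None, None))                          # None
```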
Thanks&Regards,
Jai -
Hi,
Has anyone tried loading data records to HANA DB 1.00 Build 20 from the CTL files provided in https://wiki.wdf.sap.corp/wiki/display/HANADemoenvironment/HowtoloadCOPAdata ?
It worked on HANA DB 1.00 Build 15 but not with the new build...
The procedure was exactly the same.
The creation of the tables works, but I get this error when importing data from each CSV file:
Command with HANA Studio (Build 20):
load from '/tmp/export/TCURT.ctl' threads 4 batch 1000;
Error:
Could not execute 'load from '/tmp/export/TCURT.ctl' threads 4 batch 1000'
SAP DBTech JDBC: [257]: sql syntax error: line 1 (at pos 3643214585)
I also tried: load from '/tmp/export/TCURT.ctl' -> same error...
We have all privileges on the tmp folder and files.
I also tried copying fresh data from the source...
Do you have an idea?
Thanks in advance for your help.
Best regards,
Eric MG
Hello,
The LOAD TABLE command was never public; its full syntax was never mentioned in any SAP guide on help.sap.com or the Service Marketplace (except the SAP HANA Pocketbook draft). Therefore it is SAP-internal syntax that can be changed at any time, and I guess that is what happened: the command is either deprecated or now has a different syntax.
If you need to import a CSV file, I would suggest using the official way: either the IMPORT command (which was also not very public before, but since SPS 03 it is mentioned in the SQLScript Guide) or, best of all, the SAP HANA Studio import functionality (just be sure you choose CSV as the file type).
You can find explanation of both ways in this blog:
SAP HANA - Modeling Content Migration - part 2: Import, Post-migration Activities
Tomas -
How to export data with column headers in sql server 2008 with bcp command?
Hi all,
I want to know how to export data with column headers in SQL Server 2008 with the bcp command. I know how to do it with the Import and Export Wizard. When I
try to export data with the bcp command, the data is copied but the column names do not come with it.
I am using the below query:-
EXEC master..xp_cmdshell
'BCP "SELECT * FROM [tempdb].[dbo].[VBAS_ErrorLog] " QUERYOUT "D:\Temp\SQLServer.log" -c -t , -T -S SERVER-A'
Thanks,
SAAD.
Hi All,
I have done as per your suggestion, but I face the problem below: the PRINT statement produces the correct query, but EXEC master..xp_cmdshell @BCPCMD displays the error message shown below.
DECLARE @BCPCMD nvarchar(4000)
DECLARE @BCPCMD1 nvarchar(4000)
DECLARE @BCPCMD2 nvarchar(4000)
DECLARE @SQLEXPRESS varchar(50)
DECLARE @filepath nvarchar(150), @SQLServer varchar(50)
SET @filepath = N'"D:\Temp\LDH_SQLErrorlog_' + CAST(YEAR(GETDATE()) as varchar(4))
    + RIGHT('00' + CAST(MONTH(GETDATE()) as varchar(2)), 2)
    + RIGHT('00' + CAST(DAY(GETDATE()) as varchar(2)), 2) + '.log" '
SET @SQLServer = (SELECT @@SERVERNAME)
SELECT @BCPCMD1 = '''BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT '
SELECT @BCPCMD2 = '-c -t , -T -S ' + @SQLServer
SET @BCPCMD = @BCPCMD1 + @filepath + @BCPCMD2
Print @BCPCMD
-- Print output below:
'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername'
EXEC master..xp_cmdshell @BCPCMD
-- Error:
''BCP' is not recognized as an internal or external command,
operable program or batch file.
NULL
If I copy the printed output as below and execute it in CMD, it works fine. Could you please suggest what the problem is in the above query?
EXEC master..xp_cmdshell 'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername'
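On the original question: bcp's QUERYOUT writes data rows only and never emits column headers. One common workaround is to write the header line yourself and prepend it to bcp's output file; a minimal Python sketch of that concatenation step (file and column names are hypothetical):

```python
# Stand-in for the headerless file bcp produced.
with open("SQLServer.log", "w") as f:
    f.write("1,Some error,2013-03-13\n")

# Prepend the header line to bcp's output.
header = "ErrorID,ErrorMessage,LoggedAt\n"
with open("SQLServer.log") as f:
    body = f.read()
with open("SQLServer_with_header.csv", "w") as f:
    f.write(header + body)

print(open("SQLServer_with_header.csv").readline().strip())
```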
Thanks, SAAD. -
Beginner trying to import data from GL_INTERFACE to GL
Hello, I'm a beginner with Oracle GL and I'm not able to do an import from GL_INTERFACE. I have put my data into GL_INTERFACE, but I think something is wrong with it, because when I try to import data
with Journals -> Import -> Run, Oracle answers that GL_INTERFACE is empty!
I think that maybe my insert is not correct and that's why Oracle doesn't want to use it... can someone help me?
----> I have put the data in this way:
insert into gl_interface (status,set_of_books_id,accounting_date, currency_code,date_created,
created_by, actual_flag, user_je_category_name, user_je_source_name, segment1,segment2,segment3,
entered_dr, entered_cr,transaction_date,reference1 )
values ('NEW', '1609', sysdate, 'FRF', sysdate,1008009, 'A', 'xx jab payroll', 'xx jab payroll', '01','002','4004',111.11,0,
sysdate,'xx_finance_jab_demo');
insert into gl_interface (status,set_of_books_id,accounting_date, currency_code,date_created,
created_by, actual_flag, user_je_category_name, user_je_source_name, segment1,segment2,segment3,
entered_dr, entered_cr,transaction_date,reference1 )
values ('NEW', '1609', sysdate, 'FRF', sysdate,1008009, 'A', 'xx jab payroll', 'xx jab payroll', '01','002','1005',0,111.11,
sysdate,'xx_finance_jab_demo');
------------> Oracle sends me this message:
General Ledger: Version : 11.5.0 - Development
Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
GLLEZL module: Journal Import
Current system time is 14-MAR-2007 15:39:25
Running in Debug Mode
gllsob() 14-MAR-2007 15:39:25sob_id = 124
sob_name = Vision France
coa_id = 50569
num_segments = 6
delim = '.'
segments =
SEGMENT1
SEGMENT2
SEGMENT3
SEGMENT4
SEGMENT5
SEGMENT6
index segment is SEGMENT2
balancing segment is SEGMENT1
currency = EUR
sus_flag = Y
ic_flag = Y
latest_opened_encumbrance_year = 2006
pd_type = Month
<< gllsob() 14-MAR-2007 15:39:25
gllsys() 14-MAR-2007 15:39:25fnd_user_id = 1008009
fnd_user_name = JAB-DEVELOPPEUR
fnd_login_id = 2675718
con_request_id = 2918896
sus_on = 0
from_date =
to_date =
create_summary = 0
archive = 0
num_rec = 1000
num_flex = 2500
run_id = 55578
<< gllsys() 14-MAR-2007 15:39:25
SHRD0108: Retrieved 51 records from fnd_currencies
gllcsa() 14-MAR-2007 15:39:25<< gllcsa() 14-MAR-2007 15:39:25
gllcnt() 14-MAR-2007 15:39:25SHRD0118: Updated 1 record(s) in table: gl_interface_control
source name = xx jab payroll
group id = -1
LEZL0001: Found 1 sources to process.
glluch() 14-MAR-2007 15:39:25<< glluch() 14-MAR-2007 15:39:25
gl_import_hook_pkg.pre_module_hook() 14-MAR-2007 15:39:26<< gl_import_hook_pkg.pre_module_hook() 14-MAR-2007 15:39:26
glusbe() 14-MAR-2007 15:39:26<< glusbe() 14-MAR-2007 15:39:26
<< gllcnt() 14-MAR-2007 15:39:26
gllpst() 14-MAR-2007 15:39:26SHRD0108: Retrieved 110 records from gl_period_statuses
<< gllpst() 14-MAR-2007 15:39:27
glldat() 14-MAR-2007 15:39:27Successfully built decode fragment for period_name and period_year
gllbud() 14-MAR-2007 15:39:27SHRD0108: Retrieved 10 records from the budget tables
<< gllbud() 14-MAR-2007 15:39:27
gllenc() 14-MAR-2007 15:39:27SHRD0108: Retrieved 15 records from gl_encumbrance_types
<< gllenc() 14-MAR-2007 15:39:27
glldlc() 14-MAR-2007 15:39:27<< glldlc() 14-MAR-2007 15:39:27
gllcvr() 14-MAR-2007 15:39:27SHRD0108: Retrieved 6 records from gl_daily_conversion_types
<< gllcvr() 14-MAR-2007 15:39:27
gllfss() 14-MAR-2007 15:39:27LEZL0005: Successfully finished building dynamic SQL statement.
<< gllfss() 14-MAR-2007 15:39:27
gllcje() 14-MAR-2007 15:39:27main_stmt:
select int.rowid
decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
, '', replace(ccid_cc.SEGMENT2,'.','
') || '.' || replace(ccid_cc.SEGMENT1,'.','
') || '.' || replace(ccid_cc.SEGMENT3,'.','
') || '.' || replace(ccid_cc.SEGMENT4,'.','
') || '.' || replace(ccid_cc.SEGMENT5,'.','
') || '.' || replace(ccid_cc.SEGMENT6,'.','
, replace(int.SEGMENT2,'.','
') || '.' || replace(int.SEGMENT1,'.','
') || '.' || replace(int.SEGMENT3,'.','
') || '.' || replace(int.SEGMENT4,'.','
') || '.' || replace(int.SEGMENT5,'.','
') || '.' || replace(int.SEGMENT6,'.','
') ) flexfield , nvl(flex_cc.code_combination_id,
nvl(int.code_combination_id, -4))
, decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
, '', decode(ccid_cc.code_combination_id,
null, decode(int.code_combination_id, null, -4, -5),
decode(sign(nvl(ccid_cc.start_date_active, int.accounting_date-1)
- int.accounting_date),
1, -1,
decode(sign(nvl(ccid_cc.end_date_active, int.accounting_date +1)
- int.accounting_date),
-1, -1, 0)) +
decode(ccid_cc.enabled_flag,
'N', -10, 0) +
decode(ccid_cc.summary_flag, 'Y', -100,
decode(int.actual_flag,
'B', decode(ccid_cc.detail_budgeting_allowed_flag,
'N', -100, 0),
decode(ccid_cc.detail_posting_allowed_flag,
'N', -100, 0)))),
decode(flex_cc.code_combination_id,
null, -4,
decode(sign(nvl(flex_cc.start_date_active, int.accounting_date-1)
- int.accounting_date),
1, -1,
decode(sign(nvl(flex_cc.end_date_active, int.accounting_date +1)
- int.accounting_date),
-1, -1, 0)) +
decode(flex_cc.enabled_flag,
'N', -10, 0) +
decode(flex_cc.summary_flag, 'Y', -100,
decode(int.actual_flag,
'B', decode(flex_cc.detail_budgeting_allowed_flag,
'N', -100, 0),
decode(flex_cc.detail_posting_allowed_flag,
'N', -100, 0)))))
, int.user_je_category_name
, int.user_je_category_name
, 'UNKNOWN' period_name
, decode(actual_flag, 'B'
, decode(period_name, NULL, '-1' ,period_name), nvl(period_name, '0')) period_name2
, currency_code
, decode(actual_flag
, 'A', actual_flag
, 'B', decode(budget_version_id
, 1210, actual_flag
, 1211, actual_flag
, 1212, actual_flag
, 1331, actual_flag
, 1657, actual_flag
, 1658, actual_flag
, NULL, '1', '6')
, 'E', decode(encumbrance_type_id
, 1000, actual_flag
, 1001, actual_flag
, 1022, actual_flag
, 1023, actual_flag
, 1024, actual_flag
, 1048, actual_flag
, 1049, actual_flag
, 1050, actual_flag
, 1025, actual_flag
, 999, actual_flag
, 1045, actual_flag
, 1046, actual_flag
, 1047, actual_flag
, 1068, actual_flag
, 1088, actual_flag
, NULL, '3', '4'), '5') actual_flag
, '0' exception_rate
, decode(currency_code
, 'EUR', 1
, 'STAT', 1
, decode(actual_flag, 'E', -8, 'B', 1
, decode(user_currency_conversion_type
, 'User', decode(currency_conversion_rate, NULL, -1, currency_conversion_rate)
,'Corporate',decode(currency_conversion_date,NULL,-2,-6)
,'Spot',decode(currency_conversion_date,NULL,-2,-6)
,'Reporting',decode(currency_conversion_date,NULL,-2,-6)
,'HRUK',decode(currency_conversion_date,NULL,-2,-6)
,'DALY',decode(currency_conversion_date,NULL,-2,-6)
,'HLI',decode(currency_conversion_date,NULL,-2,-6)
, NULL, decode(currency_conversion_rate,NULL,
decode(decode(nvl(to_char(entered_dr),'X'),'X',1,2),decode(nvl(to_char(accounted_dr),'X'),'X',1,2),
decode(decode(nvl(to_char(entered_cr),'X'),'X',1,2),decode(nvl(to_char(accounted_cr),'X'),'X',1,2),-20,-3),-3),-9),-9))) currency_conversion_rate
, to_number(to_char(nvl(int.currency_conversion_date, int.accounting_date), 'J'))
, decode(int.actual_flag
, 'A', decode(int.currency_code
, 'EUR', 'User'
, 'STAT', 'User'
, nvl(int.user_currency_conversion_type, 'User'))
, 'B', 'User', 'E', 'User'
, nvl(int.user_currency_conversion_type, 'User')) user_currency_conversion_type
, ltrim(rtrim(substrb(rtrim(substrb(int.reference1, 1, 50)) || ' ' || int.user_je_source_name || ' 2918896: ' || int.actual_flag || ' ' || int.group_id, 1, 100)))
, rtrim(substrb(nvl(rtrim(int.reference2), 'Journal Import ' || int.user_je_source_name || ' 2918896:'), 1, 240))
, ltrim(rtrim(substrb(rtrim(rtrim(substrb(int.reference4, 1, 25)) || ' ' || int.user_je_category_name || ' ' || int.currency_code || decode(int.actual_flag, 'E', ' ' || int.encumbrance_type_id, 'B', ' ' || int.budget_version_id, '') || ' ' || int.user_currency_conversion_type || ' ' || decode(int.user_currency_conversion_type, NULL, '', 'User', to_char(int.currency_conversion_rate), to_char(int.currency_conversion_date))) || ' ' || substrb(int.reference8, 1, 15) || int.originating_bal_seg_value, 1, 100)))
, rtrim(nvl(rtrim(int.reference5), 'Journal Import 2918896:'))
, rtrim(substrb(nvl(rtrim(int.reference6), 'Journal Import Created'), 1, 80))
, rtrim(decode(upper(substrb(nvl(rtrim(int.reference7), 'N'), 1, 1)),'Y','Y', 'N'))
, decode(upper(substrb(int.reference7, 1, 1)), 'Y', decode(rtrim(reference8), NULL, '-1', rtrim(substrb(reference8, 1, 15))), NULL)
, rtrim(upper(substrb(int.reference9, 1, 1)))
, rtrim(nvl(rtrim(int.reference10), nvl(to_char(int.subledger_doc_sequence_value), 'Journal Import Created')))
, int.entered_dr
, int.entered_cr
, to_number(to_char(int.accounting_date,'J'))
, to_char(int.accounting_date, 'YYYY/MM/DD')
, int.user_je_source_name
, nvl(int.encumbrance_type_id, -1)
, nvl(int.budget_version_id, -1)
, NULL
, int.stat_amount
, decode(int.actual_flag
, 'E', decode(int.currency_code, 'STAT', '1', '0'), '0')
, decode(int.actual_flag
, 'A', decode(int.budget_version_id
, NULL, decode(int.encumbrance_type_id, NULL, '0', '1')
, decode(int.encumbrance_type_id, NULL, '2', '3'))
, 'B', decode(int.encumbrance_type_id
, NULL, '0', '4')
, 'E', decode(int.budget_version_id
, NULL, '0', '5'), '0')
, int.accounted_dr
, int.accounted_cr
, nvl(int.group_id, -1)
, nvl(int.average_journal_flag, 'N')
, int.originating_bal_seg_value
from GL_INTERFACE int,
gl_code_combinations flex_cc,
gl_code_combinations ccid_cc
where int.set_of_books_id = 124
and int.status != 'PROCESSED'
and (int.user_je_source_name,nvl(int.group_id,-1)) in (('xx jab payroll', -1))
and flex_cc.SEGMENT1(+) = int.SEGMENT1
and flex_cc.SEGMENT2(+) = int.SEGMENT2
and flex_cc.SEGMENT3(+) = int.SEGMENT3
and flex_cc.SEGMENT4(+) = int.SEGMENT4
and flex_cc.SEGMENT5(+) = int.SEGMENT5
and flex_cc.SEGMENT6(+) = int.SEGMENT6
and flex_cc.chart_of_accounts_id(+) = 50569
and flex_cc.template_id(+) is NULL
and ccid_cc.code_combination_id(+) = int.code_combination_id
and ccid_cc.chart_of_accounts_id(+) = 50569
and ccid_cc.template_id(+) is NULL
order by decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
, rpad(ccid_cc.SEGMENT2,30) || '.' || rpad(ccid_cc.SEGMENT1,30) || '.' || rpad(ccid_cc.SEGMENT3,30) || '.' || rpad(ccid_cc.SEGMENT4,30) || '.' || rpad(ccid_cc.SEGMENT5,30) || '.' || rpad(ccid_cc.SEGMENT6,30)
, rpad(int.SEGMENT2,30) || '.' || rpad(int.SEGMENT1,30) || '.' || rpad(int.SEGMENT3,30) || '.' || rpad(int.SEGMENT4,30) || '.' || rpad(int.SEGMENT5,30) || '.' || rpad(int.SEGMENT6,30)
) , int.entered_dr, int.accounted_dr, int.entered_cr, int.accounted_cr, int.accounting_date
control->len_mainsql = 16402
length of main_stmt = 7428
upd_stmt.arr:
update GL_INTERFACE
set status = :status
, status_description = :description
, je_batch_id = :batch_id
, je_header_id = :header_id
, je_line_num = :line_num
, code_combination_id = decode(:ccid, '-1', code_combination_id, :ccid)
, accounted_dr = :acc_dr
, accounted_cr = :acc_cr
, descr_flex_error_message = :descr_description
, request_id = to_number(:req_id)
where rowid = :row_id
upd_stmt.len: 394
ins_stmt.arr:
insert into gl_je_lines
( je_header_id, je_line_num, last_update_date, creation_date, last_updated_by, created_by , set_of_books_id, code_combination_id ,period_name, effective_date , status , entered_dr , entered_cr , accounted_dr , accounted_cr , reference_1 , reference_2
, reference_3 , reference_4 , reference_5 , reference_6 , reference_7 , reference_8 , reference_9 , reference_10 , description
, stat_amount , attribute1 , attribute2 , attribute3 , attribute4 , attribute5 , attribute6 ,attribute7 , attribute8
, attribute9 , attribute10 , attribute11 , attribute12 , attribute13 , attribute14, attribute15, attribute16, attribute17
, attribute18 , attribute19 , attribute20 , context , context2 , context3 , invoice_amount , invoice_date , invoice_identifier
, tax_code , no1 , ussgl_transaction_code , gl_sl_link_id , gl_sl_link_table , subledger_doc_sequence_id , subledger_doc_sequence_value
, jgzz_recon_ref , ignore_rate_flag)
SELECT
:je_header_id , :je_line_num , sysdate , sysdate , 1008009 , 1008009 , 124 , :ccid , :period_name
, decode(substr(:account_date, 1, 1), '-', trunc(sysdate), to_date(:account_date, 'YYYY/MM/DD'))
, 'U' , :entered_dr , :entered_cr , :accounted_dr , :accounted_cr
, reference21, reference22, reference23, reference24, reference25, reference26, reference27, reference28, reference29
, reference30, :description, :stat_amt, '' , '', '', '', '', '', '', '', '' , '', '', '', '', '', '', '', '', '', '', ''
, '', '', '', '', '', '', '', '', '', gl_sl_link_id
, gl_sl_link_table
, subledger_doc_sequence_id
, subledger_doc_sequence_value
, jgzz_recon_ref
, null
FROM GL_INTERFACE
where rowid = :row_id
ins_stmt.len: 1818
glluch() 14-MAR-2007 15:39:27<< glluch() 14-MAR-2007 15:39:27
LEZL0008: Found no interface records to process.
LEZL0009: Check SET_OF_BOOKS_ID, GROUP_ID, and USER_JE_SOURCE_NAME of interface records.
If no GROUP_ID is specified, then only data with no GROUP_ID will be retrieved. Note that most data
from the Oracle subledgers has a GROUP_ID, and will not be retrieved if no GROUP_ID is specified.
SHRD0119: Deleted 1 record(s) from gl_interface_control.
Start of log messages from FND_FILE
End of log messages from FND_FILE
Executing request completion options...
Finished executing request completion options.
No data was found in the GL_INTERFACE table.
Concurrent request completed
Current system time is 14-MAR-2007 15:39:27
---------------------------------------------------------------------------
As the error message said, you need to specify a group ID.
as per documentation :
GROUP_ID: Enter a unique group number to distinguish import data within a
source. You can run Journal Import in parallel for the same source if you specify a
unique group number for each request.
For example, if you load data for both payables and receivables, you need to use a different group ID to separate the payables data from the receivables data.
HTH -
Import data from a ooo.base to Derby
Hi,
I need to import data from an ooo.base (OpenOffice Base) database into Derby.
I am not an expert in this kind of work, but:
- I prepared the data in a spreadsheet, exporting it from ooo.base.
- And I read in the documentation that Derby allows importing data with the statement:
SYSCS_UTIL.SYSCS_IMPORT_DATA(.....);
Now I am trying to do things this way, but I am not able to write the above statement correctly.
I tried it in a few different ways, for example:
CALL SYSCS_UTIL.SYSCS_IMPORT_DATA(NULL, 'STAFF', null, '1,3,4', 'data.del', null, null, null,0);
CALL(" SYSCS_UTIL.SYSCS_IMPORT_DATA(NULL, 'STAFF', null, '1,3,4', 'data.del', null, null, null,0)");
but the NetBeans IDE always shows that the statement has a syntax error.
I spent much time trying to understand how to write this SQL procedure, so I would like some help with it.
Thank you in advance
Regards
Angelo Moreschini
Hi jshell,
it is nice to hear from you again.
After I read your suggestion I looked more carefully at the Derby manuals.
Derby does have tools like the ones you supposed; the tool is named ij, but it works from the command line and it is not easy to use.
The import/export procedures of Derby can be used from the ij utility or inside a Java program.
These procedures live in the file derbytools.jar, and that file has to be on the classpath if you want to use them.
With derbytools.jar on the classpath I could execute the code below (posted to benefit others who may be interested) without errors:
PreparedStatement ps = connection.prepareStatement("CALL SYSCS_UTIL.SYSCS_EXPORT_TABLE (?,?,?,?,?,?)");
ps.setString(1, null);
ps.setString(2, "STAFF");
ps.setString(3, "staff.dat");
ps.setString(4, "%");
ps.setString(5, null);
ps.setString(6, null);
ps.execute();
Thank you for the advice
regards
Angelo. -
I have a problem.
Could you help me?
I have two tables, 'customers' and 'customers_old', with the same columns, and I have to import data from 'customers_old' into 'customers'.
How do I do it?
thx
I have a problem.
How do I import data from table BAEVENT_OLD into table BAEVENT?
I would like to use a SQL command.
-- DDL for Table BAEVENT
CREATE TABLE "BA_SYSTEM"."BAEVENT"
( "ID" NUMBER(19,0),
"DESCRIPTION" VARCHAR2(255 BYTE),
"TIMESTAMP" DATE,
"PRINCIPAL" VARCHAR2(255 BYTE),
"BAOBJECT" VARCHAR2(255 BYTE),
"NAMESPACE" VARCHAR2(255 BYTE),
"RESULT" VARCHAR2(255 BYTE),
"CHANNEL" VARCHAR2(255 BYTE),
"REQUEST_ID" NUMBER(19,0),
"ARCHIVE_ID" NUMBER(19,0)
) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "BA_SYSTEM_DAT" ;
-- DDL for Index SYS_C0020270
CREATE UNIQUE INDEX "BA_SYSTEM"."SYS_C0020270" ON "BA_SYSTEM"."BAEVENT" ("ID")
PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "BA_SYSTEM_DAT" ;
-- Constraints for Table BAEVENT
ALTER TABLE "BA_SYSTEM"."BAEVENT" MODIFY ("ID" NOT NULL ENABLE);
ALTER TABLE "BA_SYSTEM"."BAEVENT" ADD PRIMARY KEY ("ID")
USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "BA_SYSTEM_DAT" ENABLE;
-- Ref Constraints for Table BAEVENT
ALTER TABLE "BA_SYSTEM"."BAEVENT" ADD CONSTRAINT "FK1649C99B631EE188" FOREIGN KEY ("ARCHIVE_ID")
REFERENCES "BA_SYSTEM"."BAEVENTARCHIVE" ("ID") ENABLE;
***********************************************************************************************************************************************
-- DDL for Table BAEVENT_OLD
CREATE TABLE "BA_SYSTEM"."BAEVENT_OLD"
( "ID" NUMBER(19,0),
"BAOBJECT" VARCHAR2(255 BYTE),
"CHANNEL" VARCHAR2(255 BYTE),
"DESCRIPTION" VARCHAR2(255 BYTE),
"NAMESPACE" VARCHAR2(255 BYTE),
"PRINCIPAL" VARCHAR2(255 BYTE),
"RESULT" VARCHAR2(255 BYTE),
"TIMESTAMP" DATE,
"REQUEST_ID" NUMBER(19,0)
) PCTFREE 10 PCTUSED 0 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "BA_SYSTEM_DAT" ENABLE ROW MOVEMENT ;
-- DDL for Index SYS_C007023
CREATE UNIQUE INDEX "BA_SYSTEM"."SYS_C007023" ON "BA_SYSTEM"."BAEVENT_OLD" ("ID")
PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "BA_SYSTEM_DAT" ;
-- Constraints for Table BAEVENT_OLD
ALTER TABLE "BA_SYSTEM"."BAEVENT_OLD" MODIFY ("ID" NOT NULL ENABLE);
ALTER TABLE "BA_SYSTEM"."BAEVENT_OLD" ADD PRIMARY KEY ("ID")
USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "BA_SYSTEM_DAT" ENABLE;
-- Ref Constraints for Table BAEVENT_OLD
ALTER TABLE "BA_SYSTEM"."BAEVENT_OLD" ADD CONSTRAINT "FK1649C99B495C1452" FOREIGN KEY ("REQUEST_ID")
REFERENCES "BA_SYSTEM"."BAREQUEST_OLD" ("ID") ENABLE;
-
HOW TO IMPORT DATA MORE THAN ONCE FROM THE SAME EXPORT DUMP FILE?
Before asking my question, I'd like to mention that I'm a French speaker,
so my English is a little bit bad. Sorry for that.
My problem is: IMPORT.
How can I import data a SECOND TIME from an export dump file within Oracle?
My export dump file was made successfully (full export), and then I
tried to import the data for the first time.
I got the following messages in my log file (I ADDED SOME COMMENTS):
Warning: the objects were exported by L1, not by you
. importing SYSTEM's objects into SYSTEM
REM ************** CREATING TABLESPACES *****
REM *********************************************
IMP-00015: following statement failed because the object already exists:
"CREATE TABLESPACE "USER_DATA" DATAFILE 'E:\ORANT\DATABASE\USR1ORCL.ORA' SI"
"ZE 3145728 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAX"
"EXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
IMP-00015: following statement failed because the object already exists:
"CREATE TABLESPACE "ROLLBACK_DATA" DATAFILE 'E:\ORANT\DATABASE\RBS1ORCL.ORA"
"' SIZE 10485760 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS "
"1 MAXEXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
etc........
IMP-00017: following statement failed with ORACLE error 1119:
"CREATE TABLESPACE "L1" DATAFILE 'E:\ORADATA\L1.DBF' SIZE 1048576000 "
"DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAXEXTENTS 121 PCTIN"
"CREASE 50) ONLINE PERMANENT"
IMP-00003: ORACLE error 1119 encountered
ORA-01119: error in creating database file 'E:\ORADATA\L1.DBF'
ORA-09200: sfccf: error creating file
OSD-04002: unable to open file
O/S-Error: (OS 3) The system cannot find the path specified
--->etc..........
The drive E: with the folder E:\ORADATA didn't exist, but after
all that I created it.
See below, before my IMPORT statement:
REM ********************* CREATING USER *********
REM ********************************************
IMP-00017: following statement failed with ORACLE error 959:
"CREATE USER "L1" IDENTIFIED BY VALUES 'A6E0DAA6865E7627' DEFAULT TABLESPACE"
" "L1" TEMPORARY TABLESPACE "TEMPORARY_DATA""
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'L1' does not exist
IMP-00017: following statement failed with ORACLE error 959:
"CREATE USER "MLCO" IDENTIFIED BY VALUES '56AC6447B7D50467' DEFAULT TABLESPA"
"CE "MLCO" TEMPORARY TABLESPACE "TEMPORARY_DATA""
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'MLCO' does not exist
ETC.......
REM ********************* GRANTING ROLES ***********
REM ************************************************
IMP-00017: following statement failed with ORACLE error 1917:
"GRANT ALTER ANY TABLE to "L1" "
IMP-00003: ORACLE error 1917 encountered
ORA-01917: user or role 'L1' does not exist
ETC.........
IMP-00017: following statement failed with ORACLE error 1918:
"ALTER USER "L1" DEFAULT ROLE ALL"
IMP-00003: ORACLE error 1918 encountered
ORA-01918: user 'L1' does not exist
-- that is normal, since the creation of the
tablespace failed !!
REM******************************
IMP-00015: following statement failed because the object already exists:
"CREATE ROLLBACK SEGMENT RB_TEMP STORAGE (INITIAL 10240 NEXT 10240 MINEXTENT"
"S 2 MAXEXTENTS 121) TABLESPACE "SYSTEM""
IMP-00015: following statement failed because
. importing SCOTT's objects into SCOTT
IMP-00015: following statement failed because the object already exists:
"CREATE SEQUENCE "EVT_PROFILE_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999"
"999999 INCREMENT BY 1 START WITH 21 CACHE 20 NOORDER NOCYCLE"
ETC............
importing L1's objects into L1
IMP-00017: following statement failed with ORACLE error 1435:
"ALTER SCHEMA = "L1""
IMP-00003: ORACLE error 1435 encountered
ORA-01435: user does not exist
REM *************** IMPORTING TABLES *******************
REM ****************************************************
. importing SYSTEM's objects into SYSTEM
. . importing table "AN1999_BDAT" 243 rows imported
. . importing table "BOPD" 112 rows imported
. . importing table "BOINFO_AP" 49
ETC................
. . importing table "BO_WHF" 2 rows imported
IMP-00015: following statement failed because the object already exists:
"CREATE TABLE "DEF$_CALL" ("DEFERRED_TRAN_DB" VARCHAR2(128),
IMP-00015: following statement failed because the object already exists:
"CREATE SYNONYM "DBA_ROLES" FOR "SYS"."DBA_ROLES""
IMP-00015: following statement failed because the object already exists:
"CREATE SYNONYM "DBA_ERRORS" FOR "SYS"."DBA_ERRORS""
IMP-00008: unrecognized statement in the export file:
. importing L1's objects into L1
IMP-00017: following statement failed with ORACLE error 1435:
"ALTER SCHEMA = "L1""
IMP-00008: unrecognized statement in the export file:
J
Import terminated successfully with warnings.
-------------------------------------
So after analysing this log file, I created
the appropriate drives and folders, since the
import statement could not find them:
E:\ORADATA, G:\ORDATA, etc.
And I started to IMPORT ONE MORE TIME, with:
$ IMP73 sys/pssw Full=Y FILE=c:\temp\FOLD_1\data_1.dmp BUFFER=64000
COMMIT=Y INDEXFILE=c:\temp\FOLD_1\BOO_idx.sql
LOG=c:\temp\FOLD_1\BOO_log.LOG DESTROY=Y IGNORE=Y;
After that I could see neither the users nor the
tables created,
and the following messages appeared in the log file:
Warning: the objects were exported by L1, not by you
. . skipping table "AN1999_BDAT"
. . skipping table "ANPK"
. . skipping table "BOAP"
. . skipping table "BOO_D"
ETC.....skipping all the tables
. . skipping table "THIN_PER0"
. . skipping table "UPDATE_TEMP"
Import terminated successfully without warnings.
And only 2 of the new tablespaces (originally 3) were
created, without any data in them (I checked that in
Oracle Storage Manager: the tablespaces exist
with 0.002 used space; originally 60 MB each!!).
So,
how can I import data (with the full import option) successfully
MORE THAN ONE TIME from an exported dump file?
Even if we have to overwrite tablespaces, tables and users.
Thank you very much.
The Member Feedback forum is for suggestions and feedback for OTN Developer Services. This forum is not monitored by Oracle support or product teams and so Oracle product and technology related questions will not be answered. We recommend that you post this thread to the appropriate Database forum.
The main URL is:
http://forums.oracle.com/forums/index.jsp?cat=18
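For what it's worth, the sequence usually suggested for a clean re-import is: first create every directory the dump's CREATE TABLESPACE statements point at (that is what the ORA-01119 / O/S-Error (OS 3) above complained about), then rerun imp with IGNORE=Y so "object already exists" (IMP-00015) stops being fatal and rows are loaded into the pre-existing tables. The sketch below only prints the command instead of running it, since imp needs a live instance; the paths, dump file name, and system/manager credentials are placeholders, not values from this thread:

```shell
#!/bin/sh
# Sketch only: paths, dump file name, and credentials are placeholders.

# 1. Create the datafile directory BEFORE re-importing, so the
#    CREATE TABLESPACE statements in the dump can create their files.
DATA_DIR="/tmp/ORADATA"        # on Windows this would be e.g. E:\ORADATA
mkdir -p "$DATA_DIR"

# 2. Build the re-import command line. FULL=Y replays the whole dump;
#    IGNORE=Y makes imp insert rows into objects that already exist
#    instead of aborting on IMP-00015.
build_imp_cmd() {
    dump_file="$1"
    log_file="$2"
    printf 'imp system/manager FULL=Y FILE=%s IGNORE=Y COMMIT=Y BUFFER=64000 LOG=%s\n' \
        "$dump_file" "$log_file"
}

build_imp_cmd /tmp/data_1.dmp /tmp/reimport.log
```

Note that IGNORE=Y inserts rows into tables that already hold data, so a repeated full import can duplicate rows; dropping the target users and tablespaces first (or importing into a fresh database) avoids that.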