IC Dataloading Issue
Hi, I made a new cube ZIC_C04 which has only the few KFs and dimensions that I required in a report.
I didn't use IC_C03 because I had to include some KFs and one dimension from the master data (which was itself modified). The cube is updated by 2LIS_03_BF only.
The issue is that when I load data into the cube, the request monitor shows that the load has completed successfully and sets the status to green, but the request itself remains yellow when I click Manage to check the status of the request inside the cube.
Please let me know how to take care of this issue.
Hi Prasun,
Check whether your InfoCube settings have automatic request processing ticked for 'Set quality status to OK'.
InfoCube --> Manage --> Environment --> Automatic Request Processing.
Hope this helps.
Regards,
Siva.
Similar Messages
-
Hi,
We are facing an issue with ASO dataload.
When we load '0' values into ASO, we see them as #Missing in Smart View and in Database --> Preview Data. This is impacting our formulas.
We did not replace '0' with #Missing in our rule files.
This works fine with BSO, where we see '0' when the value is 0 and the rest as #Missing.
Can I get any help on what could have gone wrong?
Thanks in advance
Edited by: 844747 on Nov 23, 2012 6:14 AM
In one sense nothing has gone "wrong": the default behaviour of ASO data loads is to exclude missing and zero rows. This is to keep the DB small. Depending on what version you are on, you can override this default behaviour in EAS or MaxL.
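If the zeros need to survive an ASO load, one version-dependent place to control this is the MaxL aggregate storage load buffer, whose initialization properties decide whether missing and/or zero cells are dropped. A sketch, assuming the ASOsamp.Sample demo database and an illustrative file name data.txt; check the MaxL reference for your release, since the exact options vary:

```maxl
/* Initialize a load buffer that drops only #Missing cells.              */
/* Omitting ignore_zero_values is what preserves the explicit 0 values.  */
alter database 'ASOsamp'.'Sample' initialize load_buffer with buffer_id 1
    property ignore_missing_values;

/* Load the file into the buffer, then commit the buffer to the cube.    */
import database 'ASOsamp'.'Sample' data from server data_file 'data.txt'
    to load_buffer with buffer_id 1 on error abort;
import database 'ASOsamp'.'Sample' data from load_buffer with buffer_id 1;
```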
-
Customer Standard Dataloader issue
Hello,
I am trying to use the Dataloader to load the customer profile. I am not sure how to navigate sideways through the various tabs. For instance, I want to navigate from the top block to the 'Classification' tab so that I can populate the Profile Class.
Thank you
SK
Do you mean DataLoad? If yes, please see the documentation -- http://www.dataload.com/help/index.htm
Thanks,
Hussein -
2LIS_02_SCL Datasource dataload issue
Hi Sdns,
I am using the 2LIS_02_SCL datasource and I am seeing some strange behaviour from it.
I filled the setup tables without any issues. I then tried to load the data into BI using a Repair Full Request InfoPackage, and found that some of the PO documents are missing in BI. I ran the extraction for the same missing POs in RSA3 on R/3.
In RSA3 I am able to see the data for the missing POs, but these POs are not transferred to BI the first time.
I then loaded the data again without any selections in the InfoPackage; I get the data for some of the missing records, but not all.
For more information: this is happening in the quality system.
Please help me on this issue.
Thanks in Advance
karunakar
Edited by: karunakark on Apr 5, 2011 3:14 PM
Where are you checking the data in BW?
Is it the PSA or the DSO?
Also, you said that you are able to see the data in RSA3 for POs which do not get loaded to BI.
Please try to fetch such a PO using Repair Full -> InfoPackage Selection -> PO Number.
See if you are able to see the data in the PSA of 2LIS_02_SCL. -
Hi All,
I have loaded the data through a flat file, but it is not reflected in Essbase. Please tell me what might be the cause.
Thanks in advance...
Did you look at the exact intersection where the data was loaded? If not, did you calculate the database? Loads usually happen at lower levels, and to look at the data at upper levels you need to perform a calculation.
-
Hi All,
In our BW system, while loading from the APO system the request fails with the error messages "Error 7 while sending an IDoc" and "Error occurred in data selection".
Thanks
Sunil
Edited by: sunil chowdary on Jun 30, 2010 4:32 PM
Hi,
Check the RFC connections in SM58. Also check BD87 and WE20 for the IDocs. This error generally comes up when the RFCs are not working properly.
Now delete the request from the targets by setting the QM status to red, and repeat the load. If it happens again, contact the Basis people about the RFC connections between the systems.
Regards,
Jack -
Hi All,
Can anyone list the problems you have faced in the reporting environment in a support project?
What kind of tickets do you get from the client side in reporting?
Thanks
Linda ([email protected])
Hi Linda,
If you are taking up reporting issues in a support project, then apart from dataload issues you will come across issues/tickets like:
1. Data not matching / incorrect data
2. Changes in report structure (e.g. they need one or two extra fields with some calculation involved)
3. Hierarchies to be included in reports (e.g. for newly introduced products/materials)
4. Validation of source data (e.g. R/3) against BW data, especially for applications like COPA and FICO.
And many more issues, based on the client requirements.
Hope it helps...
Best Regards,
DMK
Assign points if it helps... -
Hi All,
How can I find the date and time (log detail) of when the load of attribute groups starts and finishes, and when the flexfields are compiled?
Thanks,
Sal
Using DataLoad, the application is crashing after loading some records, so we need to reduce the speed. How can I avoid this issue?
Are there any log files generated?
What could be the reason for this (an EBS issue or a DataLoad issue)?
I cannot really tell, since this could be related to either EBS or DataLoad.
Here we are using DataLoad Classic.
Please help me on this.
Did you see the DataLoad user guide? -- http://www.dataload.com/help/support/gettingsupport.htm
You may also ask in DataLoad Forum -- http://www.dataload.com/forum
Thanks,
Hussein -
Cannot uniquely identify member
Hi all,
While building the dimensions in Essbase, I am getting a warning "Cannot uniquely identify member..." in the dataload.err file.
Has anybody come across this kind of error?
I am building the dimensions from a relational database using parent/child references.
I have enabled duplicate members in the outline.
Awaiting a reply at the earliest.
Thanks in advance.
As Glenn said...
Turning duplicates on is one of those "grab gun, aim at foot, pull trigger" options. Apart from the dataload issues, the disambiguation issues on the reporting side are utter insanity! It may not get you right away, but sooner or later you are guaranteed to find yourself "in the emergency room" when you use this option.
It's bad enough when accounting/finance calls down to ask why their "revenue is wrong" and you have to play 20 questions to narrow the issue down to something actionable; now you have even more vagueness to deal with.
Of course, there are times when this seems perfectly "okay", as in using Month names in a year structure of a time dimension... but the additional context should have just been added to the members in the first place to avoid the redundancy of having to use them everywhere you turn anyway.
I cannot imagine a situation in which turning the duplicate member names on is better than the two or three common alternatives to this nightmare. -
Can anyone please explain in detail how to handle these dataload issues? I would like to understand the following scenario in BI 7.0 and R/3 4.6c:
1) R/3 --> PSA --> DSO --> CUBE
So there are one InfoPackage and two DTPs involved.
Let us take a scenario. Please explain to me in detail (steps) how to fix the load if it fails in R/3 --> PSA, PSA --> DSO, or DSO --> CUBE.
I would appreciate your help in advance, and points will be rewarded, of course.
BI Developer
Hi,
1) A load from R/3 --> PSA generally fails due to an RFC connection issue. You can check the RFC connection in SM59.
If it fails, make the status red, delete the failed request from the PSA, and load it again.
2) A load from PSA to DSO may fail for many reasons:
a) Lock issue: check the lock in SM12 and wait for the lock to be released; then make the QM status red, delete the request from the target, and repeat the load.
The target may also be locked by a change run; in that case too, you have to repeat the load after the ACR completes.
b) The load failed because the last delta failed: first rectify the last delta, then repeat this load.
3) DSO --> CUBE:
a) Here too the load may fail due to a lock issue. Delete the request from the target after making the QM status red; once the lock is released, repeat the load.
b) While loading data from the DSO to the InfoCube, the DTP run failed due to a short dump. The error analysis gives the description DBIF_RSQL_SQL_ERROR.
This error is usually seen when the table behind the ODS is being updated from more than one source system simultaneously. Here the data packets from more than one request contain one or more common records. This gives rise to a deadlock while inserting records into the ODS table, and subsequently a short dump is encountered, leading to the failure of one request. The solution is to cancel the loads and run them serially. A possible long-term solution is to load the data up to the PSA in parallel and then load the ODS table from the PSA in serial (change the InfoPackage to the 'PSA only' option and tick 'Update subsequent data targets').
Solution
It may be possible that the job is set up in such a way that the activation of data in the ODS takes place only after data is received from all four regions. Thus, if the failing request is deleted the correct requests will also be deleted. Hence it is required to change the status of the failed job from red/yellow to green, activate the ODS data and then re-start the load from the (failing) source after correction of any possible errors.
c) While loading the active data from the first ODS into a cube, it fails with the following errors:
err #1) Data Package 1: arrived in BW; Processing: error records written to application log
err #2) Fiscal year variant C1 not expected
Sol: Go to SPRO (SAP Customizing Implementation Guide)
Open the tree:
SAP Netweaver
SAP Business Information Warehouse
Maintain Fiscal Year Variant
Hope this helps you.
Regards,
Debjani
Edited by: Debjani Mukherjee on Sep 24, 2008 8:47 PM
Edited by: Debjani Mukherjee on Sep 24, 2008 8:53 PM
Edited by: Debjani Mukherjee on Sep 24, 2008 8:54 PM -
Hi all,
The delta load won't include historical data, and BC uses delta data.
--How can I load data including historical data if I use BC and have finished the first load from R/3 to BW?
--What data did I transfer from R/3 to BW when I finished the first data load? Is it current data?
--In the OMO1 setup, can anyone please give me an example for the 'posting period' option? Do I have to assign a 'fiscal year variant'? It seems that I only have the 'posting period' option for the V3 update type. I selected 'weekly' when I used V3, and I lost all data in the datasource. Why?
Thank you very much for your help.
Hi John,
The delta load won't include historical data and BC uses delta data.
<i>What do you mean by this? If you want to bring all the R/3 data (whatever is available in R/3), then the delta initialization WILL bring all the data, after which the delta updates will bring only the changed data from R/3. So it is not true that you won't get history data with delta extraction. (Perhaps I misunderstood your definition of "history data"; please clarify.)</i>
--How can I load data including historical data if I use BC and have finished the first load from R/3 to BW?
<i>When you initialize the delta without any selections, it should bring all the data in R/3. What other data are you expecting?</i>
--What data did I transfer from R/3 to BW when I finished first data load? Is it current data?
<i>Depends on the Selections you are using during your extraction!</i>
Regards,
Sree -
Query performance and data loading performance issues
What are the query performance issues we need to take care of? Please explain and let me know the transaction codes. It's urgent.
What are the dataloading performance issues we need to take care of? Please explain and let me know the transaction codes. It's urgent.
Will reward full points.
Regards,
Guru
BW back end
Some Tips -
1)Identify long-running extraction processes on the source system. Extraction processes are performed by several extraction jobs running on the source system. The run-time of these jobs affects the performance. Use transaction code SM37 Background Processing Job Management to analyze the run-times of these jobs. If the run-time of data collection jobs lasts for several hours, schedule these jobs to run more frequently. This way, less data is written into update tables for each run and extraction performance increases.
2)Identify high run-times for ABAP code, especially for user exits. The quality of any custom ABAP programs used in data extraction affects the extraction performance. Use transaction code SE30 ABAP/4 Run-time Analysis and then run the analysis for the transaction code RSA3 Extractor Checker. The system then records the activities of the extraction program so you can review them to identify time-consuming activities. Eliminate those long-running activities or substitute them with alternative program logic.
3)Identify expensive SQL statements. If database run-time is high for extraction jobs, use transaction code ST05 Performance Trace. On this screen, select ALEREMOTE user and then select SQL trace to record the SQL statements. Identify the time-consuming sections from the results. If the data-selection times are high on a particular SQL statement, index the DataSource tables to increase the performance of selection (see no. 6 below). While using ST05, make sure that no other extraction job is running with ALEREMOTE user.
4)Balance loads by distributing processes onto different servers if possible. If your site uses more than one BW application server, distribute the extraction processes to different servers using transaction code SM59 Maintain RFC Destination. Load balancing is possible only if the extraction program allows this option.
5)Set optimum parameters for data-packet size. Packet size affects the number of data requests to the database. Set the data-packet size to optimum values for an efficient data-extraction mechanism. To find the optimum value, start with a packet size in the range of 50,000 to 100,000 and gradually increase it. At some point, you will reach the threshold at which increasing packet size further does not provide any performance increase. To set the packet size, use transaction code SBIW BW IMG Menu on the source system. To set the data load parameters for flat-file uploads, use transaction code RSCUSTV6 in BW.
6)Build indexes on DataSource tables based on selection criteria. Indexing DataSource tables improves the extraction performance, because it reduces the read times of those tables.
7)Execute collection jobs in parallel. Like the Business Content extractors, generic extractors have a number of collection jobs to retrieve relevant data from DataSource tables. Scheduling these collection jobs to run in parallel reduces the total extraction time, and they can be scheduled via transaction code SM37 in the source system.
8). Break up your data selections for InfoPackages and schedule the portions to run in parallel. This parallel upload mechanism sends different portions of the data to BW at the same time, and as a result the total upload time is reduced. You can schedule InfoPackages in the Administrator Workbench.
You can upload data from a data target (InfoCube or ODS) to another data target within the BW system. While uploading, you can schedule more than one InfoPackage, with different selection options in each one. For example, fiscal year or fiscal year period can be used as selection options. Avoid using parallel uploads for high volumes of data if hardware resources are constrained. Each InfoPackage uses one background process (if scheduled to run in the background) or dialog process (if scheduled to run online) of the application server, and too many processes could overwhelm a slow server.
9). Building secondary indexes on the tables for the selection fields optimizes these tables for reading, reducing extraction time. If your selection fields are not key fields on the table, primary indexes are not much help when accessing data. In this case it is better to create secondary indexes with the selection fields on the associated table, using the ABAP Dictionary, to improve selection performance.
10)Analyze upload times to the PSA and identify long-running uploads. When you extract the data using the PSA method, data is written into PSA tables in the BW system. If your data is on the order of tens of millions of records, consider partitioning these PSA tables for better performance, but pay attention to the partition sizes. Partitioning PSA tables improves data-load performance because it's faster to insert data into smaller database tables. Partitioning also provides increased performance for maintenance of PSA tables; for example, you can delete a portion of data faster. You can set the size of each partition in the PSA parameters screen, in transaction code SPRO or RSCUSTV6, so that BW creates a new partition automatically when a threshold value is reached.
11)Debug any routines in the transfer and update rules and eliminate single selects from the routines. Using single selects in custom ABAP routines for selecting data from database tables reduces performance considerably. It is better to use buffers and array operations. When you use buffers or array operations, the system reads data from the database tables and stores it in the memory for manipulation, improving performance. If you do not use buffers or array operations, the whole reading process is performed on the database with many table accesses, and performance deteriorates. Also, extensive use of library transformations in the ABAP code reduces performance; since these transformations are not compiled in advance, they are carried out during run-time.
12)Before uploading a high volume of transaction data into InfoCubes, activate the number-range buffer for dimension IDs. The number-range buffer is a parameter that identifies the number of sequential dimension IDs stored in the memory. If you increase the number range before high-volume data upload, you reduce the number of reads from the dimension tables and hence increase the upload performance. Do not forget to set the number-range values back to their original values after the upload. Use transaction code SNRO to maintain the number range buffer values for InfoCubes.
13)Drop the indexes before uploading high-volume data into InfoCubes. Regenerate them after the upload. Indexes on InfoCubes are optimized for reading data from the InfoCubes. If the indexes exist during the upload, BW reads the indexes and tries to insert the records according to the indexes, resulting in poor upload performance. You can automate the dropping and regeneration of the indexes through InfoPackage scheduling. You can drop indexes in the Manage InfoCube screen in the Administrator Workbench.
14)IDoc (intermediate document) archiving improves the extraction and loading performance and can be applied on both BW and R/3 systems. In addition to IDoc archiving, data archiving is available for InfoCubes and ODS objects.
Hope it Helps
Chetan
@CP.. -
In the recent past I have been able to use a program called DATALOAD (shareware) to load data into Firefox web forms (specifically web based sales tax returns).
However, now every time I try it, Firefox crashes, so I am forced to use IE (which I hate). The last time I used DataLoad was in April 2011 (for my quarterly tax returns).
One of the latest updates to Firefox must have a conflict with this program.
Can you please research what changed in Firefox to make it crash when using DataLoad? I do not want to use IE if I do not have to.
Hmmm, that is very odd. What is the link you are using? Are you logged
in to YouTube?
Many site issues can be caused by corrupt cookies or cache.
* Clear the Cache and
* Remove cookies. Warning: this will log you out of sites you're logged in to.
Type about:preferences<Enter> in the address bar.
* Cookies: select Privacy. Under History, select "Firefox will: Use Custom Settings". Press the button on the right side called Show Cookies. Use the search bar to look for the site. Note: there may be more than one entry; remove all of them.
* Cache: select Advanced > Network. Across from Cached Web Content, press Clear Now.
If there is still a problem,
Start Firefox in Safe Mode -- https://support.mozilla.org/en-US/kb/troubleshoot-firefox-issues-using-safe-mode
While you are in safe mode:
Type about:preferences#advanced<Enter> in the address bar.
Under Advanced, select General.
Look for and turn off Use Hardware Acceleration.
Poke around safe web sites. Are there any problems?
Then restart. -
CSV creation through PLSQL having issue with some rows of data
Hi,
We are having an issue with some rows of the data created by the PL/SQL code below.
Issue description:
- 50,000 rows of data in my CSV
- around 20 fields in a row
- in some of the 50,000 rows, some of the 20 fields come back empty in the file, although the query actually returns data for those fields
Hope the issue is clear
Code given below
CREATE OR REPLACE FUNCTION TSC_OM_AUDIT(ERR_DESC IN OUT VARCHAR2,
WEB_FILE_URL IN OUT VARCHAR2 ) RETURN BOOLEAN IS
--Variable Declaration
L_buff_line VARCHAR2(500);
L_hdr_str VARCHAR2(32700);
L_selling_UOM VARCHAR2(5);
L_prim_bar_code VARCHAR2(30);
L_status_pri_bar VARCHAR2(2);
L_multpl_bar_exist VARCHAR2(2);
L_multpl_prim_bar_exist VARCHAR2(2);
L_prim_simp_pack VARCHAR2(30);
L_staus_prim_simp_pack VARCHAR2(2);
L_mupl_prim_pack_exist VARCHAR2(2);
L_prim_supp_prim_simpl_pack VARCHAR2(15);
L_prim_sim_supp_to_TPNB VARCHAR2(2);
L_sel1 VARCHAR2(200);
L_sel2 VARCHAR2(200);
L_sel3 VARCHAR2(200);
L_item_till_desc VARCHAR2(200);
L_lengh1 VARCHAR2(200);
L_width1 VARCHAR2(200);
L_height1 VARCHAR2(200);
L_lengh2 VARCHAR2(200);
L_width2 VARCHAR2(200);
L_height2 VARCHAR2(200);
L_lengh3 VARCHAR2(200);
L_width3 VARCHAR2(200);
L_height3 VARCHAR2(200);
--Item type
CURSOR C_ITEM is
select im.item L_item,
im.status L_status,
im.item_desc L_item_desc,
'Regular' L_item_type,
im.dept L_dept,
im.class L_class,
im.subclass L_subclass,
'N/A' L_CW_item_order_type,
'N/A' L_CW_item_sale_type,
im.standard_uom L_standard_uom,
NULL L_selling_uom,
im.tsl_base_item L_tsl_base_item
from item_master im
where im.item_number_type='TPNB'
-- and im.last_update_id = 'DATALOAD'
-- and im.create_datetime>=to_timestamp('2010-11-05 10:57:47','YYYY-MM-DD HH24:MI:SS')
and im.sellable_ind='Y'
and im.orderable_ind='Y'
and im.inventory_ind='Y'
and im.item_xform_ind='N'
and im.catch_weight_ind='N'
and rownum<10;
--order by im.item asc;
Cursor C_Selling_UOM (tpnb varchar2) is
select selling_uom
from rpm_item_zone_price
where item = tpnb;
Cursor C_prim_bar (tpnb varchar2) is
select item,
status
from item_master iem
where item_parent = tpnb
and iem.primary_ref_item_ind ='Y';
Cursor C_multi_bar_exit (tpnb varchar2) is
select count(*)
from item_master iem
where item_parent = tpnb;
Cursor C_multpl_prim_bar_exist (tpnb varchar2) is
select count(*)
from item_master
where item_parent = tpnb
and primary_ref_item_ind ='Y';
Cursor C_staus_prim_simp_pack (tpnb varchar2) is
select piem.pack_no,
iem.status
from item_master iem,
packitem piem
where piem.item = tpnb
and piem.pack_no = iem.item
and iem.tsl_prim_pack_ind ='Y';
Cursor C_multpl_prim_pack_exist (tpnb varchar2) is
select count(*)
from item_master iem,
packitem piem
where piem.item = tpnb
and piem.pack_no = iem.item
and iem.tsl_prim_pack_ind ='Y';
Cursor C_prim_supp_prim_simpl_pack (tpnd varchar2) is
select supplier
from item_supplier
where item = tpnd
and primary_supp_ind= 'Y';
Cursor C_prim_sim_supp_to_TPNB (tpnb varchar2,suppl number) is
select 'Y'
from item_supplier
where item = tpnb
and supplier = suppl;
Cursor C_item_descretion_SEL (tpnb varchar2) is
select sel_desc_1,
sel_desc_2,
sel_desc_3
from tsl_itemdesc_sel
where item =tpnb;
Cursor C_item_till_descretion (tpnb varchar2) is
select till_desc
from tsl_itemdesc_till
where item=tpnb;
Cursor C_EA (tpnb varchar2,suppl number) is
select length,
width,
height
from item_supp_country_dim
where item= tpnb
and supplier = suppl
and dim_object='EA';
Cursor C_CS (tpnb varchar2,suppl number) is
select length,
width,
height
from item_supp_country_dim
where item= tpnb
and supplier = suppl
and dim_object='TYUNIT';
Cursor C_TRAY (tpnb varchar2,suppl number) is
select length,
width,
height
from item_supp_country_dim
where item= tpnb
and supplier = suppl
and dim_object='TRAY';
BEGIN
--INITIAL
WEB_FILE_URL := TSC_RPT_GRS_EXL_GEN.WEB_URL||TSC_RPT_GRS_EXL_GEN.OPEN_FILE;
TSC_RPT_GRS_EXL_GEN.put_line('Poland Production Cutover');
TSC_RPT_GRS_EXL_GEN.skip_line;
l_hdr_str := 'L_item_TPNB,L_status,L_item_desc,L_item_type,L_section,L_class,L_subclass,L_cw_order_type,L_cw_sale_type,L_std_UOM,L_selling_UOM,'||
'L_base_item,L_prim_bar_code,L_status_pri_bar,L_multpl_bar_exist,L_multpl_prim_bar_exist,L_prim_simple_pack,L_staus_prim_simp_pack,L_mupl_prim_pack_exist,'||
'L_prim_supp_prim_simpl_pack,L_prim_sim_supp_to_TPNB,L_sel1,L_sel2,L_sel3,L_item_till_desc,L_lengh1,L_width1,L_height1,L_lengh2,L_width2,L_height2,L_lengh3,L_width3,L_height3';
l_hdr_str := TSC_RPT_GRS_EXL_GEN.CONVERT_SEPARATOR(l_hdr_str);
TSC_RPT_GRS_EXL_GEN.put_line(l_hdr_str);
for rec_c_item in C_ITEM
LOOP
open C_Selling_UOM (rec_c_item.L_item);
fetch C_Selling_UOM into L_selling_UOM;
close C_Selling_UOM;
open C_prim_bar (rec_c_item.L_item);
fetch C_prim_bar into L_prim_bar_code,L_status_pri_bar;
close C_prim_bar;
open C_multi_bar_exit (rec_c_item.L_item);
fetch C_multi_bar_exit into L_multpl_bar_exist;
close C_multi_bar_exit;
IF to_number(trim(L_multpl_bar_exist)) > 1 THEN
L_multpl_bar_exist:='Y';
ELSE
L_multpl_bar_exist:='N';
END IF;
open C_multpl_prim_bar_exist (rec_c_item.L_item);
fetch C_multpl_prim_bar_exist into L_multpl_prim_bar_exist;
close C_multpl_prim_bar_exist;
IF to_number(trim(L_multpl_prim_bar_exist)) > 1 THEN
L_multpl_prim_bar_exist:='Y';
ELSE
L_multpl_prim_bar_exist:='N';
END IF;
open C_staus_prim_simp_pack (rec_c_item.L_item);
fetch C_staus_prim_simp_pack into L_prim_simp_pack,L_staus_prim_simp_pack;
close C_staus_prim_simp_pack;
open C_multpl_prim_pack_exist (rec_c_item.L_item);
fetch C_multpl_prim_pack_exist into L_mupl_prim_pack_exist;
close C_multpl_prim_pack_exist ;
IF to_number(trim(L_mupl_prim_pack_exist)) > 1 THEN
L_mupl_prim_pack_exist:='Y';
ELSE
L_mupl_prim_pack_exist:='N';
END IF;
open C_prim_supp_prim_simpl_pack (trim(L_prim_simp_pack));
fetch C_prim_supp_prim_simpl_pack into L_prim_supp_prim_simpl_pack;
close C_prim_supp_prim_simpl_pack ;
open C_prim_sim_supp_to_TPNB (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
fetch C_prim_sim_supp_to_TPNB into L_prim_sim_supp_to_TPNB;
close C_prim_sim_supp_to_TPNB ;
open C_item_descretion_SEL (rec_c_item.L_item);
fetch C_item_descretion_SEL into L_sel1,L_sel2,L_sel3;
close C_item_descretion_SEL ;
open C_item_till_descretion (rec_c_item.L_item);
fetch C_item_till_descretion into L_item_till_desc;
close C_item_till_descretion ;
open C_EA (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
fetch C_EA into L_lengh1,L_width1,L_height1;
close C_EA ;
open C_CS (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
fetch C_CS into L_lengh2,L_width2,L_height2;
close C_CS ;
open C_TRAY (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
fetch C_TRAY into L_lengh3,L_width3,L_height3;
close C_TRAY ;
L_buff_line := TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item), TRUE)||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_status))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item_desc))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item_type))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_dept))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_class))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_subclass))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_CW_item_order_type))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_CW_item_sale_type))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_standard_uom))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_selling_UOM)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_tsl_base_item))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_bar_code)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_status_pri_bar)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_multpl_bar_exist)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_multpl_prim_bar_exist)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_simp_pack)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_staus_prim_simp_pack)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_mupl_prim_pack_exist)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_supp_prim_simpl_pack)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_sim_supp_to_TPNB)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel1)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel2)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel3)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_item_till_desc)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh1)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width1)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height1)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh2)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width2)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height2)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh3)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width3)))||
TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height3)));
TSC_RPT_GRS_EXL_GEN.PUT_LINE(L_buff_line);
L_selling_UOM :=NULL;
L_prim_bar_code :=NULL;
L_status_pri_bar :=NULL;
L_multpl_bar_exist :=NULL;
L_multpl_prim_bar_exist :=NULL;
L_prim_simp_pack :=NULL;
L_staus_prim_simp_pack :=NULL;
L_mupl_prim_pack_exist :=NULL;
L_prim_supp_prim_simpl_pack :=NULL;
L_prim_sim_supp_to_TPNB :=NULL;
L_sel1 :=NULL;
L_sel2 :=NULL;
L_sel3 :=NULL;
L_item_till_desc :=NULL;
L_lengh1 :=NULL;
L_width1 :=NULL;
L_height1 :=NULL;
L_lengh2 :=NULL;
L_width2 :=NULL;
L_height2 :=NULL;
L_lengh3 :=NULL;
L_width3 :=NULL;
L_height3 :=NULL;
END LOOP;
TSC_RPT_GRS_EXL_GEN.close_file;
return TRUE;
EXCEPTION WHEN OTHERS THEN
ERR_DESC := '['||SQLCODE||']-'||SUBSTR(SQLERRM,1,200);
return FALSE;
END TSC_OM_AUDIT;
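One thing worth ruling out when only some rows lose fields is an unquoted separator inside a data value: free-text fields such as L_item_desc or L_item_till_desc can contain commas, which shifts all later columns of that row when the CSV is opened. A defensive quoting helper could be added; this is a hypothetical sketch (csv_quote is not part of the original code or of TSC_RPT_GRS_EXL_GEN):

```sql
-- Hypothetical helper: make a value safe for a comma-separated file.
FUNCTION csv_quote(p_val IN VARCHAR2) RETURN VARCHAR2 IS
BEGIN
  -- Wrap the value in double quotes when it contains the separator or a
  -- quote character, doubling any embedded quotes (standard CSV escaping).
  IF INSTR(p_val, ',') > 0 OR INSTR(p_val, '"') > 0 THEN
    RETURN '"' || REPLACE(p_val, '"', '""') || '"';
  END IF;
  RETURN p_val;
END csv_quote;
```

Free-text fields would then be passed through this helper before concatenation, e.g. csv_quote(rec_c_item.L_item_desc), leaving numeric fields untouched.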
Please suggest something on this.
Regards,
Shamed H
Hi Shamid,
This forum is only for questions regarding the SQL Developer tool. Please post in Forum Home > Database > SQL and PL/SQL:
PL/SQL
Regards,
Gary
SQL Developer Team -
Runtime Error DBIF_RSQL_INVALID_RSQL on dataload to PSA
Hi,
Apologies for the long problem statement.
We need to extract data from a non- SAP SQL system to our SAP Netweaver 7.01 (EhP1 SP5) BI system. We have established a UD Connection and connected to tables in the SQL system.
After a few dataloads and data validity, some fields were changed in the source tables in the SQL system.
Now, when we try to extract data into our PSA by triggering InfoPackage, it gives a Runtime Error with following messages:
Runtime Errors: DBIF_RSQL_INVALID_RSQL
Exception: CX_SY_OPEN_SQL_DB
Error in module RSQL of the database interface.
An exception occurred that is explained in detail below.
The exception, which is assigned to class 'CX_SY_OPEN_SQL_DB', was not caught in procedure "INSERT_ODS" "(FORM)", nor was it propagated by a RAISING clause.
Since the caller of the procedure could not have anticipated that the exception would occur, the current program is terminated.
The reason for the exception is:
In a SELECT access, the read file could not be placed in the target field provided.
Either the conversion is not supported for the type of the target field, the target field is too small to include the value, or the data does not have the format required for the target field.
We tried creating another datasource on the same table and could extract data successfully. There are quite a few more datasources that have the same issue. Is there another way to resolve this issue while retaining the current datasource?
Also, for another table which has more records (around 20K), the extraction keeps running even after creating a new datasource.
Please help.
-Abhishek.
Edited by: Abhishek Rajan on Sep 28, 2010 8:05 PM
Hi Rajan,
Please see this thread; I guess it may be useful. When the datasource structure changes, there is an inconsistency in the database that will not allow us to delete or load data.
This resolved my problem:
PSA Not Deleting, error: DDL time(___1):.....0 milliseconds
Thanks
Vamsi