Checking for data in tables
Hi gurus,
I have an internal table (itab) of FI transactional data, which includes G/L accounts in a field named BELNR.
Now I need to check whether each G/L account has master data; if no match is found, that G/L account must be stored so it can be displayed later.
So I thought of doing it this way. Is this a best practice?
tables: skb1.
data: i_skb1 like skb1 occurs 0 with header line.
data: begin of i_saknr occurs 0,
        saknr like skb1-saknr,
      end of i_saknr.

loop at itab.
  select single * from skb1 into i_skb1 where saknr = itab-belnr.
  if sy-subrc <> 0.
    i_saknr-saknr = itab-belnr.
    append i_saknr.
  endif.
endloop.
Is there a better way of doing this?
Sanjana,
Do it like this: select all matching master records once with FOR ALL ENTRIES, then do an in-memory READ TABLE per transaction row instead of a SELECT SINGLE inside the loop.

tables: skb1.
data: i_skb1 like skb1 occurs 0 with header line.
types: begin of ty_saknr,
         saknr type skb1-saknr,
       end of ty_saknr.
data: i_saknr type table of ty_saknr,
      wa_saknr type ty_saknr.

if itab[] is not initial.
  select * from skb1 into table i_skb1
    for all entries in itab
    where saknr = itab-belnr.
endif.
sort i_skb1 by saknr.

loop at itab.
  read table i_skb1 with key saknr = itab-belnr binary search.
  if sy-subrc <> 0.
    wa_saknr-saknr = itab-belnr.
    append wa_saknr to i_saknr.
  endif.
endloop.
Regards,
Satish
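The lookup pattern in the reply, fetching the existing master keys once and then testing membership in memory, is language-neutral. A minimal sketch in Python, with invented account numbers standing in for SKB1-SAKNR and itab-BELNR values:

```python
# Master-data check pattern: prefetch the existing keys once,
# then collect every transaction key that has no master record.

master_accounts = {"0001000100", "0001000200"}   # stand-in for SKB1-SAKNR values
transactions = ["0001000100", "0009999999", "0001000200", "0008888888"]

# Equivalent of the READ TABLE ... sy-subrc <> 0 branch per loop pass.
missing = [acct for acct in transactions if acct not in master_accounts]

print(missing)   # the accounts to display later
```

Set membership is O(1) per probe, which is the same reason the sorted table with BINARY SEARCH beats SELECT SINGLE inside the loop.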
Similar Messages
-
So i have a table with one column:
SUMA_LEI
LEI
I have a form for this table. My purpose is to allow just one row in this table, so when the user reaches the form page, they get an error like: "delete previous value, then insert new one". For this I need a PL/SQL process that checks whether a row already exists in the table; if it does, I get an error, and if not, I'm allowed to continue on that page.
I need some help as soon as possible; please help if you can. I know that I need some "if" clause, but beyond that I don't know how to do it.

A possible solution is to create a materialized view and then a check constraint on the materialized view: this should work in a multiuser environment.
Example:
SQL> drop table t;
Table dropped.
SQL>
SQL> create table t (
2 x integer,
3 y varchar2(10)
4 );
Table created.
SQL>
SQL> CREATE MATERIALIZED VIEW LOG on t
2 WITH ROWID (x, y)
3 including new values;
Materialized view log created.
SQL>
SQL> CREATE MATERIALIZED VIEW t_mv
2 REFRESH FAST ON COMMIT AS
3 SELECT count(*) cnt
4 FROM t;
Materialized view created.
SQL>
SQL> ALTER TABLE t_mv
2 ADD CONSTRAINT chk check(cnt<=1);
Table altered.
SQL>
SQL> insert into t values(1,'Ok');
1 row created.
SQL> commit;
Commit complete.
SQL>
SQL> insert into t values(2,'KO');
1 row created.
SQL> commit;
commit
ERROR at line 1:
ORA-12008: error in materialized view refresh path
ORA-02290: check constraint (TEST.CHK) violated -
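For comparison only (this is not Oracle syntax): other engines often enforce a one-row rule with a trigger rather than a materialized view. A sketch with Python's built-in sqlite3, where a BEFORE INSERT trigger rejects any second row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t (x INTEGER, y TEXT);
    -- Reject any INSERT once the table already holds a row.
    CREATE TRIGGER t_one_row BEFORE INSERT ON t
    WHEN (SELECT COUNT(*) FROM t) >= 1
    BEGIN
        SELECT RAISE(ABORT, 'delete previous value, then insert new one');
    END;
""")

conn.execute("INSERT INTO t VALUES (1, 'Ok')")   # first row is accepted
try:
    conn.execute("INSERT INTO t VALUES (2, 'KO')")
except sqlite3.DatabaseError as e:
    print(e)   # second row is rejected by the trigger
```

Note the materialized-view approach above was chosen for its multiuser semantics in Oracle; this trigger sketch only illustrates the single-row goal, not that concurrency behavior.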
Performance check for status control table
Hi,
Whenever we activate any ODS or delete a request, the message 'Performance Check for Status Control Table' appears.
What does it mean? Please explain what the status control table is and what this check does.
Thanks & Regards,
Dinakar.

Hi,
This message is not an issue in itself. It is just a SAP message which means that the system is checking/updating its control tables to determine whether the operation can be carried out or not.
If your loads are failing at this step or after this message, there can be many reasons; check some of the possible causes in these threads:
Performing check and potential update for status control table
Update from PSA error in Process Chain
problem in deleting request
Reporting not available, even if data is successfully loaded into DT
Hope this helps,
Kush kashyap -
My Apple TV froze while trying to access Netflix. I unplugged the Apple TV from the wall and re-plugged it. Now it stays on the loading time and date screen. After 10 minutes, I held Menu and hit Restart (as the troubleshooting guide said). The same problem still occurs, with the continuous "checking for date and time". And if I hold Menu to bypass this, I only have my Settings and Computers icons; all the other icons have disappeared. Not quite sure what to do; any comments would surely help.
Thanks guys, appreciate it.
Matt

Apple TV 1st generation or 2nd?
I would try plugging the HDMI from the ATV into a different HDMI port on your TV, and even trying the ATV on a different TV altogether... it sounds like the HDMI port on your TV might be bad, or there is a bad connection. Keep in mind you'll need to switch the TV input to the new HDMI port -
I need the various data structure tables for MRS
-
Create csv file for data in tables
Hi All,
I need to "export" data for about 60 tables in one of my databases to a csv file format using a "|" as a separator.
I know I can do this using a query like:
select col1 || '|' || col2 || '|' || col3 from table;
Some of my tables have more than 50 columns, so I'm guessing there is an easier way to do this than to construct a SELECT statement by hand for each table?
Thanks in advance.

I would point out that the OP did not identify the target for the files, so it could be a non-Oracle database or Excel, in which case external tables would not work, since Oracle only writes to external tables in Data Pump format. If the target is another Oracle database, then external tables would be an option. If external tables are an option, then insert/select over a database link would be a potential alternative to using csv files.
I use a SQL script to generate the Select statement to create my csv files.
set echo off
rem
rem SQL*Plus script to create comma delimited output file from table
rem
rem 20000614 Mark D Powell Automate commonly done task
rem
set pagesize 0
set verify off
set feedback off
set linesize 999
set trimspool on
accept owner prompt 'Enter table owner => '
accept tblname prompt 'Enter table name => '
spool csv2.sql
select 'select ' from sys.dual;
select decode(column_id,1,column_name,
'||''|''||'||column_name)
from sys.dba_tab_columns
where table_name = upper('&&tblname')
and owner = upper('&&owner')
order by column_id;
select 'from &&owner..&&tblname;'
from sys.dual;
spool off
undefine owner
undefine tblname
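If SQL*Plus is not a requirement, the whole export can also be scripted without generating a SELECT list at all, since most database drivers expose the column names on the cursor. A minimal sketch in Python; sqlite3 and an invented emp table stand in for the real source, and the csv module handles the "|" separator:

```python
import csv
import io
import sqlite3

# Invented stand-in for a real source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (id INTEGER, name TEXT, dept TEXT)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                 [(1, "Scott", "RESEARCH"), (2, "King", "ACCOUNTING")])

# cursor.description gives the column names, so no per-table
# SELECT list has to be hand-built, even for 50+ columns.
cur = conn.execute("SELECT * FROM emp")
out = io.StringIO()   # in real use this would be an open file
writer = csv.writer(out, delimiter="|")
writer.writerow(col[0] for col in cur.description)   # header row
writer.writerows(cur)                                # data rows

print(out.getvalue())
```

The same loop over a list of 60 table names would cover the whole schema.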
HTH -- Mark D Powell -- -
SQLPS - Checking for a specific table
I am trying to check for a table existence in each database. Any ideas why this does not work?
Import-Module "sqlps" -DisableNameChecking
$SQLServer = "wobdslezberg2"
Set-Location SQLSERVER:
$dbArray = Get-ChildItem -name -Exclude "tempdb*", "master*", "model*", "msdb*" | Sort-Object
foreach ($DatabaseName in $dbArray)
{
    $instance = $DatabaseName
    #write-host $instance
    if (test-path "SQLSERVER:\SQL\wobdslezberg2\DEFAULT\Databases\$instance\Tables\eConfig")
        {write-host "exists"}
    else
        {write-host "not exists"}
}
I am trying to iterate through all the databases on the server and check for the existence of a specific table in each database.
Actually, that line does work fine: if I comment out the test-path line, I get results as expected, and I get a list of tables like this:
dbo.SRC04_TABLES
How do I put the schema (dbo) into the test-path?
The test-path does not seem to work, as I get this message:
Test-Path : Cannot retrieve the dynamic parameters for the cmdlet. SQL Server PowerShell provider error: The number of keys specified does not match the number of keys required to address this object. The number of keys required are: Name.
At line:19 char:9
+ if (test-path "SQLSERVER:\SQL\wobdslezberg2\DEFAULT\Databases\$instance\Tabl ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Test-Path], ParameterBindingException
+ FullyQualifiedErrorId : GetDynamicParametersException,Microsoft.PowerShell.Commands.TestPathCommand
Thanks
Scott -
Authorization check for a program/table
Hi ,
Can anyone help me out with how to do an authorization check for an ABAP program and also for a table?
I have no idea about authorizations.
My requirement is that the authorization check should ensure that only users having a certain profile
1. are able to execute the program
2. can view the entries of the table.
Thanks & Regards,
Keerthi

Hello Keerthi,
I got you wrong at first!
If you want only certain users to be able to do certain operations, then you need to assign the appropriate roles to those users:
First, find the role.
Second, add the user to the role (transaction PFCG ---> Users tab).
Raj -
Can you check for data in one table or another but not both in one query?
I have a situation where I need to link two tables together, but the data may be in another (archive) table, or different records may be in both; either way, I want the latest record from whichever table has it:
ACCOUNT
AccountID Name
123 John Doe
124 Jane Donaldson
125 Harold Douglas
MARKETER_ACCOUNT
Key AccountID Marketer StartDate EndDate
1001 123 10526 8/3/2008 9/27/2009
1017 123 10987 9/28/2009 12/31/4712 (high date ~ which means currently with this marketer)
1023 124 10541 12/03/2010 12/31/4712
ARCHIVE
Key AccountID Marketer StartDate EndDate
1015 124 10526 8/3/2008 12/02/2010
1033 125 10987 01/01/2011 01/31/2012
So my query needs to return the following:
123 John Doe 10526 8/3/2008 9/27/2009
124 Jane Donaldson 10541 12/03/2010 12/31/4712 (this is the later of the two records for this account between archive and marketer_account tables)
125 Harold Douglas 10987 01/01/2011 01/31/2012 (he is only in archive, so get this record)
I'm unsure how to proceed in one query. Note that I am reading in possibly multiple accounts at a time and returning a collection back to .net
open CURSOR_ACCT
select AccountID
from
ACCOUNT A,
MARKETER_ACCOUNT M,
ARCHIVE R
where A.AccountID = nvl((select max(M.EndDate) from Marketer_account M2
where M2.AccountID = A.AccountID),
(select max(R.EndDate) from Archive R2
where R2.AccountID = A.AccountID)
and upper(A.Name) like parameter || '%'
<can you do a NVL like this? probably not... I want to be able to get the MAX record for that account off the MarketerACcount table OR the max record for that account off the Archive table, but not both>
(parameter could be "DO", so I return all names starting with DO...)

If I understand your description, I would assume that for John Doe we would expect the second row from marketer_account ("high date ~ which means currently with this marketer"). Here is a solution with analytic functions:
drop table account;
drop table marketer_account;
drop table marketer_account_archive;
create table account (
id number
, name varchar2(20)
);
insert into account values (123, 'John Doe');
insert into account values (124, 'Jane Donaldson');
insert into account values (125, 'Harold Douglas');
create table marketer_account (
key number
, AccountId number
, MktKey number
, FromDt date
, ToDate date
);
insert into marketer_account values (1001, 123, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('27.09.2009', 'dd.mm.yyyy'));
insert into marketer_account values (1017, 123, 10987, to_date('28.09.2009', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
insert into marketer_account values (1023, 124, 10541, to_date('03.12.2010', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
create table marketer_account_archive (
key number
, AccountId number
, MktKey number
, FromDt date
, ToDate date
);
insert into marketer_account_archive values (1015, 124, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('02.12.2010', 'dd.mm.yyyy'));
insert into marketer_account_archive values (1033, 125, 10987, to_date('01.01.2011', 'dd.mm.yyyy'), to_date('31.01.2012', 'dd.mm.yyyy'));
select key, AccountId, MktKey, FromDt, ToDate
, max(FromDt) over(partition by AccountId) max_FromDt
from marketer_account
union all
select key, AccountId, MktKey, FromDt, ToDate
, max(FromDt) over(partition by AccountId) max_FromDt
from marketer_account_archive;
with
basedata as (
  select key, AccountId, MktKey, FromDt, ToDate
  from marketer_account
  union all
  select key, AccountId, MktKey, FromDt, ToDate
  from marketer_account_archive
),
basedata_with_max_intervals as (
  select key, AccountId, MktKey, FromDt, ToDate
       , row_number() over(partition by AccountId order by FromDt desc) FromDt_Rank
  from basedata
),
filtered_basedata as (
  select key, AccountId, MktKey, FromDt, ToDate
  from basedata_with_max_intervals
  where FromDt_Rank = 1
)
select a.id
     , a.name
     , b.MktKey
     , b.FromDt
     , b.ToDate
  from account a
  join filtered_basedata b
    on (a.id = b.AccountId);
ID NAME MKTKEY FROMDT TODATE
123 John Doe 10987 28.09.2009 31.12.4712
124 Jane Donaldson 10541 03.12.2010 31.12.4712
125 Harold Douglas 10987 01.01.2011 31.01.2012
If your tables are big it could be necessary to do the filtering (according to your condition) in an early step (the first CTE).
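For what it's worth, the ranking step is easy to sketch outside SQL as well. Here is the same "latest FromDt per account across both tables" rule in Python, using the sample rows from above:

```python
from datetime import date

# (key, account_id, marketer_key, from_dt, to_date): current and archive rows,
# copied from the sample data in the thread.
marketer_account = [
    (1001, 123, 10526, date(2008, 8, 3),  date(2009, 9, 27)),
    (1017, 123, 10987, date(2009, 9, 28), date(4712, 12, 31)),
    (1023, 124, 10541, date(2010, 12, 3), date(4712, 12, 31)),
]
archive = [
    (1015, 124, 10526, date(2008, 8, 3), date(2010, 12, 2)),
    (1033, 125, 10987, date(2011, 1, 1), date(2012, 1, 31)),
]

# Union both tables, then keep the row with the latest start date per account:
# the analogue of ROW_NUMBER() OVER (PARTITION BY AccountId ORDER BY FromDt DESC) = 1.
latest = {}
for row in marketer_account + archive:
    account_id, from_dt = row[1], row[3]
    if account_id not in latest or from_dt > latest[account_id][3]:
        latest[account_id] = row

for account_id in sorted(latest):
    print(latest[account_id])
```

This reproduces the three expected rows: marketer 10987 for 123, 10541 for 124, and the archive-only 10987 for 125.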
Regards
Martin -
Issue:
I have an SAP BW system and an SAP HANA system, with SAP BW connecting to SAP HANA through a DB connection (named HANA).
Whenever I create an Open Hub destination of type DB table using that DB connection, the table is created at the HANA schema level (L_F50800_D).
I executed the Open Hub service without checking the 'Deleting Data from Table' option, and 16 records were loaded from BW to HANA.
On the second execution from BW to HANA, 32 records were there (it appends).
Then I executed the Open Hub service with the 'Deleting Data from Table' option checked.
Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
From SAP BW to SAP BW, the same scenario works fine.
Is this option supported through a DB connection or not?
Please see the attachment along with this discussion and help me resolve this.
From
Santhosh Kumar

Hi Ramanjaneyulu,
First of all, thanks for the reply.
The issue is at the Open Hub definition level (Destination tab and field definition): there is a check box there which I have already selected, and even with it selected, the deletion is not performed on the target.
SAP BW to SAP HANA via DB connection:
1. First DTP execution from BW: 16 records loaded to HANA.
2. Second execution from BW: the HANA side appends, so 16 + 16 = 32.
3. So I selected the 'Deleting Data from Table' check box at the Open Hub level.
4. Now the DTP throws a short dump: DBIF_RSQL_TABLE_KNOWN.
Please tell me how to resolve this. Is the 'Deleting Data from Table' option applicable for HANA?
Thanks
Santhosh Kumar -
[req] how to check whether a given oracle table exists or not + C#
Hi,
All.
Actually I am creating a program in C# which connects to the database (Oracle) and checks for the given table "OPC_GROUP". If the table doesn't exist, the program creates it; otherwise it updates all fields with the provided values.
Code:
cmd.CommandText = "SELECT tname from tab where tname = 'OPC_GROUP'";
cmd.CommandType = CommandType.Text;
int length = cmd.ExecuteNonQuery();
if (cmd.ExecuteNonQuery() > 0)
    MessageBox.Show("Table does exist");
else
    MessageBox.Show("Table doesn't exist");
But this code reports "table doesn't exist" even though the table has already been created.
What am I doing wrong?
Please help.
Thanks in advance.

Try this. ExecuteNonQuery returns the number of rows affected by DML, not the number of rows a SELECT returns, so your test never passes. Read a scalar COUNT instead:
cmd.CommandText = "SELECT count(*) from tab where tname = 'OPC_GROUP'";
cmd.CommandType = CommandType.Text;
string strnum = cmd.ExecuteScalar().ToString();
if (strnum != "0")
    MessageBox.Show("Table does exist");
else
    MessageBox.Show("Table doesn't exist");
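The same idea, asking the catalog for a count and reading back the scalar, works in any client library. An illustration with Python's built-in sqlite3; the catalog table differs per database (Oracle would query tab or user_tables instead of sqlite_master):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE OPC_GROUP (id INTEGER)")

def table_exists(conn, name):
    # COUNT(*) always returns exactly one row, so fetchone() is safe here;
    # this is the analogue of ExecuteScalar rather than ExecuteNonQuery.
    cur = conn.execute(
        "SELECT COUNT(*) FROM sqlite_master WHERE type = 'table' AND name = ?",
        (name,))
    return cur.fetchone()[0] > 0

print(table_exists(conn, "OPC_GROUP"))   # True
print(table_exists(conn, "NO_SUCH"))     # False
```

Binding the table name as a parameter also avoids the quoting mistakes that creep into hand-built catalog queries.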
Hope it helps,
Greg -
I need the format for an Excel file to load data through an InfoCube into a planning area.
Hi gurus,
I need the format for an Excel file to load data through an InfoCube into a planning area.
Can you send me what I should maintain in the header?
What I have so far is something like:
plant,location,customer,product,history qty,calander
100,delhi,suresh,nokia,250,2011211
Is this right or wrong? Can you explain, and send me the Excel file format?
babu

Hi Babu,
The file format should be the same as what you want to upload, and the sequence of the file's columns should match the communication structure:
initial columns with characteristics (e.g. plant, location, customer, product),
then the date column (check the date format) (e.g. calander),
and the last columns with key figures (history qty).
Hope this helps.
Regards,
Nawanit
Issue with table ROOSPRMSF entries for data source 0FI_AP_4
Hi Experts,
I am facing an issue: we found inconsistencies in table ROOSPRMSF in the R/3 system.
In BW , we have done initializations based on fiscal period selections (none of the selections overlap) for data source 0FI_AP_4.
We have done in total 7 initializations. So in BW system in table RSSDLINITSEL we have 7 initialization requests.
But in R/3 system we have 49 records for data source 0FI_AP_4 in ROOSPRMSF table out of which 42 are invalid records.
I suspect that these 42 invalid records are created due to the execution of program RSSM_OLTP_INIT_DELTA_UPDATE when the tables ROOSPRMSF are actually holding the 7 initialization request entries. Due to this each and every initialization request is linked to rest of the other intialization requests and ended with 49 records in table ROOSPRMSF table.
Now our data loads are running fine but daily a short dump is raised . In the daily loads, BW init records in RSSDLINITSEL are compared with ROOSPRMSF entries and all the other 42 records which are invalid are written into system log and a short dump is raised.
In order to fix these inconsistencies i checked for OSS note 852443. (Point 3 in OSS note)
But it is specified to delete the delta queue for data source 0FI_AP_4 in RSA7 and instructed to execute the program RSSM_OLTP_INIT_DELTA_UPDATE so that the ROOSPRMSF table will be reconstructed with valid records available in RSSDLINITSEL.
From OSS note 852443 point 3
"3. If the RSSDLINIT table in the BW system already contains entries, check the requests listed there in the RNR column in the monitor (transaction RSRQ). Compare these entries with the entries in the ROOSPRMSF and ROOSPRMSC tables with the INITRNR field. If, in the ROOSPRMSF and ROOSPRMSC tables for your DataSource source system combination, there are more entries with different INITRNR numbers, use transaction RSA7 in an OLTP source system to delete all entries and then use the RSSM_OLTP_INIT_DELTA_UPDATE report mentioned in the next section. For a DataMart source system, delete the entries that you cannot find in the RSSDLINIT table using the procedure described above."
My question is if we delete the delta queue in RSA7 then all the tables in R/3 (ROOSPRMSF, ROOSPRMSC, Time stamp table) and BW (RSSDLINITSEL, initialization requests will be deleted) will be cleared. Then how will the program RSSM_OLTP_INIT_DELTA_UPDATE copy entries into ROOSPRMSF table in R/3 ?
Could any one please clarify this ?
Thanks
Regards,
Jeswanth

Hi Amarnath,
Did you unhide the new field in RSA6 and regenerated the DataSource?
Often SAP will populate newly added fields (belonging to the same set of tables used for extraction) automatically (e.g. SAP uses 'move-corresponding' in its extractor code, or, in this case, reads all fields from the DD via FM BWFIU_TRANSFORM_FIELDLIST).
If the DataSource looks fine to you and the field is still not populated in RSA3 you can't go without a user-exit.
Grtx,
Marco -
Hi,
We are having trouble importing one ledger, 'GERMANY EUR GGAAP'. It works for Dec 2014, but when trying to import data for 2015 it gives an error.
The import error shows: "RuntimeError: No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'."
I tried all the knowledge docs from Oracle Support but no luck. Please help us resolve this issue, as it's occurring in our production system.
I also checked all period settings under Data Management > Setup > Integration Setup > Global Mapping and Source Mapping, and they all look correct.
Also, it's only happening to one ledger; all the other ledgers are working fine without any issues.
Thanks

Hi,
there are some support documents related to this issue.
I would suggest you have a look at them.
Regards -
Designing metadataLogging table for data warehouse
Hi:
I am in the process of creating metadata logging tables for the DW.
I have thought of 5 tables to track errors and check ETL execution.
Master table for all major jobs. Environment: Oracle 9i. DW size: not more than 30 MB to start with.
CREATE TABLE ADMIN_JOB_MASTER_TBL (
JOBNUMBER NUMBER,
JOBNAME VARCHAR2(768 BYTE),
MINTHRESHRECORDS NUMBER,
MAXTHRESHRECORDS NUMBER,
MINTHRESHTIME VARCHAR2(64 BYTE),
MAXTHRESHTIME VARCHAR2(64 BYTE),
CREATEDATE DATE
);
Audit Table for auditing jobs.
CREATE TABLE ADMIN_AUDIT_JOBS_TBL (
JOBNUMBER NUMBER,
JOBNAME VARCHAR2(768 BYTE),
STARTDATE DATE,
ENDDATE DATE,
NUMBERRECORDS NUMBER,
FAIL_OR_SUCCESS VARCHAR2(64 BYTE)
);
Step master: for jobs involving various steps and logic such as branching, loops, and flow control, breaking the job master down to a lower level for more insight.
Audit step:
CREATE TABLE ADMIN_AUDIT_STEP_TBL (
RECORDID VARCHAR2(256 BYTE),
JOBAUDITID VARCHAR2(256 BYTE),
STEPNUMBER NUMBER,
PARAMETERS VARCHAR2(512 BYTE),
NUMBERRECORDS NUMBER,
STARTDATE DATE,
ENDDATE DATE,
USERNAME VARCHAR2(256 BYTE)
);
Error_table: to track error during execution.
CREATE TABLE ERROR_TBL (
ERROR_NO NUMBER,
TABLE_NAME VARCHAR2(124 BYTE),
ERR_NUM NUMBER,
ERR_MSG VARCHAR2(124 BYTE),
ERROR_TIME DATE
);
I am thinking of loading the master tables manually with expected values, and loading the auditing tables during each run.
The error table would of course be loaded during the run.
So every time a threshold changes, I would have to update the master table manually. I don't mind doing that initially.
Would the tables and approach stated above be good for ETL?
Please, guys, let me know your thoughts. Suggest changes to the tables or the approach if you feel it is wrong.
Thanks in advance. All inputs are highly appreciated!
Regards
Som

Hi,
What better than what Oracle suggests... is there?
Have you read the Oracle doc titled
"Data Warehousing Guide" and, for example, the "Handling Data Errors with an Error Logging Table" section?
Regards,
Simon
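As a rough illustration of how the audit table gets populated per run (sqlite3 stands in for the warehouse, the table is trimmed to a few columns of ADMIN_AUDIT_JOBS_TBL, and the job names are invented):

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE admin_audit_jobs_tbl (
    jobname TEXT, startdate TEXT, enddate TEXT,
    numberrecords INTEGER, fail_or_success TEXT)""")

def run_audited(conn, jobname, job):
    """Run `job`, then record start/end time, row count, and outcome."""
    start = datetime.now().isoformat()
    try:
        records = job()          # the job returns its number of records
        status = "SUCCESS"
    except Exception:
        records, status = 0, "FAIL"
    conn.execute("INSERT INTO admin_audit_jobs_tbl VALUES (?, ?, ?, ?, ?)",
                 (jobname, start, datetime.now().isoformat(), records, status))

run_audited(conn, "LOAD_DIM_CUSTOMER", lambda: 42)    # a job that loads 42 rows
run_audited(conn, "LOAD_FACT_SALES", lambda: 1 / 0)   # a job that fails

for row in conn.execute(
        "SELECT jobname, numberrecords, fail_or_success FROM admin_audit_jobs_tbl"):
    print(row)
```

Comparing numberrecords against the MIN/MAXTHRESHRECORDS columns of the master table would then flag suspicious runs.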