Bufferoverflow loading table RSBKCMD
Hi,
System: NW2004s
OS: Windows 2003 64bit
Instance: SAP BI 7.0
DB: Oracle: 10g
I am getting warning message in SM21: "Bufferoverflow loading table RSBKCMD"
More details
The free space in the table buffer is not sufficient to load the table.
The table should either not be buffered, or the table buffer area
should be increased in the profile (zcsa/table_buffer_area for the 100%
table buffer or rtbb/buffer_length for the partial table buffer).
I checked the profile parameter zcsa/table_buffer_area; it shows an active value of 31744000 and a default value of 30000000.
Please tell me what I should do now...
Your answer will be really appreciated.
Thanks in advance,
Angeline
Hi Angeline,
Check if SAP note 1302305 is helpful.
Cheers.....,
Raghu
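Independently of the note, the SM21 message itself points at the two profile parameters. A minimal sketch of raising them in the instance profile (the values below are purely illustrative, not a recommendation - size them from the ST02 buffer statistics):

```
# Instance profile, e.g. /usr/sap/<SID>/SYS/profile/<SID>_<instance>_<host>
# Illustrative values only - derive real sizes from ST02.
zcsa/table_buffer_area = 60000000   # generic (100%) table buffer, in bytes
rtbb/buffer_length     = 30000      # partial table buffer, in KB
```

An instance restart is required for profile changes to take effect. Alternatively, if RSBKCMD does not benefit from buffering, its buffering setting can be reviewed in the table's technical settings (SE13).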
Similar Messages
-
Error while impdp: ORA-02374: conversion error loading table
Hi,
I am trying to convert the character set from WE8ISO8859P1 to AL32UTF8 using expdp/impdp. To get rid of "lossy" data, I first converted the source DB from WE8ISO8859P1 to WE8MSWIN1252. I then created a new (target) database with character set AL32UTF8 and nls_length_semantics = 'CHAR', and created all the tablespaces as in the source DB with autoextend on. I took a full export (expdp) of the source DB excluding TABLESPACE, STATISTICS, INDEX, CONSTRAINT and REF_CONSTRAINT, and imported it into the target DB using impdp. I found the following errors in the import log file:
ORA-02374: conversion error loading table "SCTCVT"."SPRADDR_CVT"
ORA-26093: input data column size (44) exceeds the maximum input size (40)
ORA-02372: data for row: CONVERT_STREET_LINE1 : 0X'20202020202020202020202020202020202020202020202020'
I checked with select query on both DBs with below results.
source DB:
04:58:42 SQL> select count(*) from "SCTCVT"."SPRADDR_CVT";
COUNT(*)
74553
target DB:
04:59:24 SQL> select count(*) from "SCTCVT"."SPRADDR_CVT";
COUNT(*)
74552
Please suggest a solution to this.
Thanks and Regards.
Edited by: user12045167 on May 9, 2011 10:39 PM
Thanks for your update, maher.
09:15:53 SQL> desc "SCTCVT"."SPRADDR_CVT"
Name Null? Type
SPRADDR_PIDM NUMBER(8)
CONVERT_PIDM VARCHAR2(9 CHAR)
SPRADDR_ATYP_CODE VARCHAR2(2 CHAR)
CONVERT_ATYP_CODE VARCHAR2(2 CHAR)
SPRADDR_SEQNO NUMBER(2)
CONVERT_SEQNO VARCHAR2(2 CHAR)
SPRADDR_FROM_DATE DATE
CONVERT_FROM_DATE VARCHAR2(8 CHAR)
SPRADDR_TO_DATE DATE
CONVERT_TO_DATE VARCHAR2(8 CHAR)
SPRADDR_STREET_LINE1 VARCHAR2(30 CHAR)
CONVERT_STREET_LINE1 VARCHAR2(40 CHAR)
SPRADDR_STREET_LINE2 VARCHAR2(30 CHAR)
CONVERT_STREET_LINE2 VARCHAR2(40 CHAR)
SPRADDR_STREET_LINE3 VARCHAR2(30 CHAR)
CONVERT_STREET_LINE3 VARCHAR2(40 CHAR)
SPRADDR_CITY VARCHAR2(20 CHAR)
CONVERT_CITY VARCHAR2(25 CHAR)
SPRADDR_STAT_CODE VARCHAR2(3 CHAR)
CONVERT_STAT_CODE VARCHAR2(25 CHAR)
SPRADDR_ZIP VARCHAR2(10 CHAR)
CONVERT_ZIP VARCHAR2(15 CHAR)
SPRADDR_CNTY_CODE VARCHAR2(5 CHAR)
CONVERT_CNTY_CODE VARCHAR2(5 CHAR)
SPRADDR_NATN_CODE VARCHAR2(5 CHAR)
CONVERT_NATN_CODE VARCHAR2(5 CHAR)
SPRADDR_PHONE_AREA VARCHAR2(3 CHAR)
CONVERT_PHONE_AREA VARCHAR2(3 CHAR)
SPRADDR_PHONE_NUMBER VARCHAR2(7 CHAR)
CONVERT_PHONE_NUMBER VARCHAR2(7 CHAR)
SPRADDR_PHONE_EXT VARCHAR2(4 CHAR)
CONVERT_PHONE_EXT VARCHAR2(4 CHAR)
SPRADDR_STATUS_IND VARCHAR2(1 CHAR)
CONVERT_STATUS_IND VARCHAR2(1 CHAR)
SPRADDR_ACTIVITY_DATE DATE
CONVERT_ACTIVITY_DATE VARCHAR2(8 CHAR)
SPRADDR_USER VARCHAR2(30 CHAR)
CONVERT_USER VARCHAR2(30 CHAR)
SPRADDR_ASRC_CODE VARCHAR2(4 CHAR)
CONVERT_ASRC_CODE VARCHAR2(4 CHAR)
SPRADDR_DELIVERY_POINT NUMBER(2)
CONVERT_DELIVERY_POINT VARCHAR2(2 CHAR)
SPRADDR_CORRECTION_DIGIT NUMBER(1)
CONVERT_CORRECTION_DIGIT VARCHAR2(1 CHAR)
SPRADDR_CARRIER_ROUTE VARCHAR2(4 CHAR)
CONVERT_CARRIER_ROUTE VARCHAR2(4 CHAR)
SPRADDR_GST_TAX_ID VARCHAR2(15 CHAR)
CONVERT_GST_TAX_ID VARCHAR2(15 CHAR)
SPRADDR_REVIEWED_IND VARCHAR2(1 CHAR)
CONVERT_REVIEWED_IND VARCHAR2(1 CHAR)
SPRADDR_REVIEWED_USER VARCHAR2(30 CHAR)
CONVERT_REVIEWED_USER VARCHAR2(30 CHAR)
SPRADDR_DATA_ORIGIN VARCHAR2(30 CHAR)
CONVERT_DATA_ORIGIN VARCHAR2(30 CHAR)
SPRADDR_CVT_RECORD_ID NUMBER(8)
SPRADDR_CVT_STATUS VARCHAR2(1 CHAR)
SPRADDR_CVT_JOB_ID NUMBER(8)
So here we can see its limit is 40 characters (CONVERT_STREET_LINE1 VARCHAR2(40 CHAR)).
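Widening the column so the converted data fits is one option. A hedged sketch (50 CHAR is an arbitrary illustration; since the database uses NLS_LENGTH_SEMANTICS=CHAR, anything of at least 44 characters would cover the reported row):

```sql
-- Sketch only: widen the target column, then re-run the import for this table.
ALTER TABLE "SCTCVT"."SPRADDR_CVT"
  MODIFY (CONVERT_STREET_LINE1 VARCHAR2(50 CHAR));
```

Since the rejected row shown above is all trailing spaces (0x20), trimming the data in the source before export would be another way out.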
Shall I go ahead and alter the column? -
How to Load table from client file in C?
Hi all,
I'm trying to determine how to write a C program to load a client file with the "load table ... using client file..." syntax without calling out to the dbisql program to load the table. I'm probably not seeing the forest for the trees here. I know I can use the bulk api (equivalent of bcp) but would rather use the load table because it is much faster.
jason
There is nothing special to do - you execute the statement from your program just like any other SQL statement. The only things to be aware of are the privilege/permission issues:
When loading from a file on a client computer:
READ CLIENT FILE privilege is also required for the database user.
Read privileges are required on the directory being read from.
The allow_read_client_file database option must be enabled.
The read_client_file secure feature must be enabled.
Revoking these privileges is also the only way you can prevent a user from executing the statement. -
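The checklist above can be sketched as the corresponding SQL Anywhere statements (syntax as in recent versions' documentation - older releases use the READCLIENTFILE authority instead; verify against your version):

```sql
-- Server-side prerequisites for LOAD TABLE ... USING CLIENT FILE (sketch):
GRANT READ CLIENT FILE TO loader_user;
SET OPTION PUBLIC.allow_read_client_file = 'On';
-- The read_client_file secure feature is controlled at server start-up.
-- Then, from the C program, the statement runs like any other SQL:
--   LOAD TABLE my_table USING CLIENT FILE 'c:\\data\\my_table.csv'
```

Here loader_user, my_table and the file path are placeholders for illustration.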
Hi All,
I am facing an issue with APEX 4.2.4, using the Data Load Table concept. In the lookup I used the Where Clause option, but the where clause does not seem to be working. Please help me with this.
Hi all,
It looks like this where clause does not filter out the 'N' data. Please help me solve this. -
ORA-02374: conversion error loading table during import using IMPDP
HI All,
We are trying to migrate the data from one database to an other database.
The source database is having character set
SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
US7ASCII
The destination database is having character set
SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
AL32UTF8
We took an export of the whole database using expdp, and when we try to import into the destination database using impdp, we get the following error:
ORA-02374: conversion error loading table <TABLE_NAME>
ORA-12899: value too large for column <COLUMN NAME> (actual: 42, maximum: 40)
ORA-02372: data for row:<COLUMN NAME> : 0X'4944454E5449464943414349E44E204445204C4C414D414441'
Kindly let me know how to overcome this issue in destination.
Thanks & Regards,
Vikas Krishna
Hi,
You can overcome this issue by increasing the column width in the target database to the maximum value required, so that all data imports successfully into the table.
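To find the affected columns and rows up front, rather than from import errors, one approach (a sketch using standard Oracle functions; <table>, <column> and <pk> are placeholders mirroring the redacted names above) is to measure each value's byte length after conversion to AL32UTF8 on the source database:

```sql
-- Sketch: run on the source DB to list rows that will no longer fit.
-- Replace <table>, <column>, <pk> and the 40-byte limit as appropriate.
SELECT <pk>,
       LENGTHB(CONVERT(<column>, 'AL32UTF8')) AS utf8_bytes
FROM   <table>
WHERE  LENGTHB(CONVERT(<column>, 'AL32UTF8')) > 40;
```

Oracle's character set scanner (csscan, or the newer DMU tool) automates this kind of check database-wide.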
Regards -
OMW hangs when loading table data?
Hello - I've got OMW to read the XML version of my Access database. However, when I tell it to migrate to Oracle, it gets through creating all the table users, tablespaces, etc., and then it dies when it gets to "loading table data". Any ideas?
Thanks
Richard
If it is an MS Access database you are migrating from, then we create a schema with the same name as the mdb filename. So for the file northwind.mdb we would create a schema called northwind. The default password is oracle unless you change it in the Oracle Model UI of the Workbench. In the Oracle Model, look under Users to see what users exist.
Once the user is created, you should simply be able to connect to that user via sqlplus and see all the schema objects/data migrated.
Donal -
Mapping will not load tables in correct order
I am running OWB 10.2.
I have a set of tables with parent/child relationships enforced by FK constraints as follows: A -> B -> C and D
I created a mapping to load tables A, B and C, and this works perfectly. I then copied the mapping and modified it to load tables A, B and D, keeping basically the same flow, particularly at the load end (i.e. just a few different attributes).
My problem is that when I generate the mapping it wants to load the tables in the order B then A then D. The original job generates the order A then B then C as expected.
I have tried setting the Target Load Order property such that A = 1, B = 2 and D = 3 but this makes no difference to the generated PL/SQL.
Has anyone else had a similar problem?
OK, figured it out.
I was putting the load order in each table operator, not in the mapping properties.
I think this is a UI issue because:
1. There are 2 places you can modify table load order but one seems to have no effect.
2. I knew table order was listed somewhere but had trouble finding it.
3. You have to click on the mapping somewhere on the canvas to get to the mapping properties. This is, however, completely different from the mapping properties you get by clicking Mapping -> Properties in the menu.
4. At no stage had I manually modified the table order before the problem occurred, yet it was overriding the order that should have been determined by the FK constraints. -
Data Load Tables - Shared components
Hello All,
There is a section that is called Data Loading in Shared components in every application that says: A Data Load Table is an existing table in your schema that has been selected for use in the data loading process, to upload data. Use Data Load Tables to define tables for use in the Data Loading create page wizard.
The question is: how can I select a table in my schema for use in the data loading process, to upload data using the wizard?
There is a packaged application called Sample Data Loading. That sample is used for specific tables, right? I tried to change those tables to the ones that I want to use, but I could not, because I could not add the tables that I want to use...
Thank you.
Hi,
The APEX version is Application Express 4.2.3.00.07.
The sample data loading application created the data loading entry in Shared Components by default. There, I don't have the option to create a new entry for the table I want to load data into. I tried to modify the existing entry, putting in the table that I want, but I couldn't, because the table it references is not editable.
I tried to modify the Data Loading page that the sample application created, but I couldn't. I can't change the source to the table that I want.
I have created a workspace at apex.oracle.com. If you want, I can give you credentials so you can help me, but I need your email to create the user for you. Thank you.
Bernardo -
External Loader Table - Merging Data
Hi,
We have a legacy system that is Cobol based and it produces flat files that we ftp into a directory on our Oracle server.
I have created an external loader table, with its location set to the specific files in that directory - currently holding about 5.6 million records.
The files contain patient data from multiple sites, so I am trying to bring them all together into three different tables:
1) Patient_Id
2) Patient
3) Patient_Address
Patient_Id holds one global identifier for each patient (GUI).
I have written three merge statements, but the performance seems to be pretty shocking. An example of one is below:
--MERGE RECORDS INTO E2E_PATIENT WHERE SITE ID MATCHES THAT IN FILENAME
MERGE INTO E2e_Patient p
USING (SELECT p.Pat_Id,
m.Title,
m.Forenames First_Name,
m.Surname,
m.Gender,
Pkg_Private_E2e.Fnc_Convert_Mdx_Dob(m.Dob,
'YYYYMMDD') Dob
FROM Mdx.e_Mpieghg1 m
INNER JOIN Commons.t_Site_Codes Sc
ON Sc.Medax_Site_Code = m.Site
INNER JOIN E2e_Patient_Id p
ON p.Local_Pas_Id = m.Mpi
AND p.Site_Id = Sc.Siteid
AND p.Sorce = 'MDX') m
ON (p.Pat_Id = m.Pat_Id)
WHEN MATCHED THEN
UPDATE
SET p.Title = m.Title,
p.First_Name = m.First_Name,
p.Surname = m.Surname,
p.Gender = m.Gender,
p.Dob = m.Dob
WHEN NOT MATCHED THEN
INSERT
(Pat_Id,
Title,
First_Name,
Surname,
Gender,
Dob)
VALUES
(m.Pat_Id,
m.Title,
m.First_Name,
m.Surname,
m.Gender,
m.Dob);
COMMIT;
If I run the merge on exactly the same data, it seems to be updating all records again - even though there are no changes - is this how the merge works?
Are there any other ways of increasing performance?
Regards
Mark
Edited by: Lloydy76 on 04-Feb-2011 04:07
Dan's approach might be a better way, but if there are a fairly large number of records coming in that do not need to be updated or inserted, then another approach might be to add a predicate to the when matched clause to avoid the updates when nothing has changed. Something like:
WHEN MATCHED THEN
UPDATE
SET p.Title = m.Title,
p.First_Name = m.First_Name,
p.Surname = m.Surname,
p.Gender = m.Gender,
p.Dob = m.Dob
WHERE p.Title != m.Title or
p.First_Name != m.First_Name or
p.Surname != m.Surname or
p.Gender != m.Gender or
p.Dob != m.Dob
John -
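One caveat on the WHERE clause in the reply above (an addition of mine, not from the thread): != treats a comparison involving NULL as unknown, so a row whose value changes to or from NULL would be skipped. A null-safe variant of the same predicate using DECODE, which treats two NULLs as equal, slots into the MERGE like this:

```sql
WHEN MATCHED THEN
  UPDATE
     SET p.Title      = m.Title,
         p.First_Name = m.First_Name,
         p.Surname    = m.Surname,
         p.Gender     = m.Gender,
         p.Dob        = m.Dob
   -- DECODE(a, b, 1, 0) yields 1 when a = b (NULLs compare equal), else 0,
   -- so "= 0" means "the values differ", including NULL transitions.
   WHERE DECODE(p.Title,      m.Title,      1, 0) = 0
      OR DECODE(p.First_Name, m.First_Name, 1, 0) = 0
      OR DECODE(p.Surname,    m.Surname,    1, 0) = 0
      OR DECODE(p.Gender,     m.Gender,     1, 0) = 0
      OR DECODE(p.Dob,        m.Dob,        1, 0) = 0
```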
Problem loading table in SQL server
Hi,
I'm trying to load a table in SQL server from another instance of SQL server.
I have defined the physical and and logical data stores and reverse engineered the models to retrieve the tables.
The target table was created manually.
If I try to run the interface, I get the following error:
ODI-1227: Task SrcSet0 (Loading) fails on the source MICROSOFT_SQL_SERVER connection DATAWAREHOUSE.
Caused By: java.sql.SQLException: [FMWGEN][SQLServer JDBC Driver][SQLServer]Incorrect syntax near '<'.
at weblogic.jdbc.sqlserverbase.ddb_.b(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddb_.a(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddb9.b(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddb9.a(Unknown Source)
at weblogic.jdbc.sqlserver.tds.ddr.v(Unknown Source)
at weblogic.jdbc.sqlserver.tds.ddr.a(Unknown Source)
at weblogic.jdbc.sqlserver.tds.ddq.a(Unknown Source)
at weblogic.jdbc.sqlserver.tds.ddr.a(Unknown Source)
at weblogic.jdbc.sqlserver.ddj.m(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddel.e(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddel.a(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddde.a(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddel.v(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddel.r(Unknown Source)
at weblogic.jdbc.sqlserverbase.ddde.execute(Unknown Source)
at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
It is trying to run the following SQL, and I'm not sure why it is trying to drop and create a view in the source system. The interface that I'm running above has just a source-to-target mapping...
drop view <Undefined>.SQLDATAWH_DATAWAREHOUSEAccountDim
Any pointers will be helpful..
Thanks in advance...
whirlpool wrote:
I think I selected the MSSQL one... but I do not have access to the server now... Is this the correct KM?
If you have selected IKM MSSQL Incremental Update then it is the correct IKM to choose.
To use this IKM, the staging area must be on the same data server as the target.
What is the LKM selected ?
I right-clicked on the Reverse-Engineering (RKM) models and imported all knowledge modules... Is that how it's done?
It is fine.
Is that the correct one? I do not understand why the interface is trying to drop and create a view in the source system.
It depends on the KM selected. So first get the names of the LKM and IKM used in the interface. -
OBIEE execute stored procedure to load tables before running report
Hi..
I want to execute a stored procedure to load database tables before running a report in OBIEE.
I need to pass 2 parameters to the stored procedure which loads the tables.
In the Connection Pool --> Connection Scripts tab --> Execute before query, I wrote the query below, using the repository variables VAR1 & VAR2, to execute the two packages:
DECLARE VAR1 number; VAR2 number;
BEGIN
schema_name.package_name1.package_body('VALUE OF(VAR1)', 'VALUE OF(VAR2)'); COMMIT;
schema_name.package_name2.package_body('VALUE OF(VAR1)', 'VALUE OF(VAR2)'); COMMIT;
END;
I am receiving the following error to declare the schema_name.package_name
+++Administrator:2a0000:2a0004:----2010/06/21 14:29:00
-------------------- Sending query to database named ACBS-OCC (id: <<49419>>):
BEGIN schema_name.package_name1.package_body1('VALUE OF(VAR1', 'VALUE OF(VAR2'); COMMIT; schema_name.package_name2.package_body2('VALUE OF(VAR1)', 'VALUE OF(VAR2)'); COMMIT;END;
+++Administrator:2a0000:2a0004:----2010/06/21 14:29:00
-------------------- Query Status: Query Failed: [nQSError: 16001] ODBC error state: S1000 code: 6550 message: [Oracle][ODBC][Ora]ORA-06550: line 1, column 7:
PLS-00201: identifier 'SCHEMA_NAME.PACKAGE_NAME1' must be declared
ORA-06550: line 1, column 7:
PL/SQL: Statement ignored
ORA-06550: line 1, column 93:
PLS-00201: identifier 'SCHEMA_NAME.PACKAGE_NAME2' must be declared
ORA-06550: line 1, column 93:
PL/SQL: Statement ignored.
[nQSError: 16015] SQL statement execution failed.
Please suggest how to declare and execute the stored procedure.
Thanks in advance.
Hi,
I know that any function / procedure needs to be called using an EVALUATE function in OBIEE.
Thanks,
Vijay -
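Back on the PLS-00201 error in the connection-pool script above: that error usually means the package is simply not visible to the connecting user, independent of OBIEE. A hedged sketch of the usual checks (user and object names are placeholders matching the redacted ones in the post):

```sql
-- Run as the package owner (or a DBA); names are placeholders.
GRANT EXECUTE ON schema_name.package_name1 TO obiee_pool_user;
GRANT EXECUTE ON schema_name.package_name2 TO obiee_pool_user;
-- Also confirm the call is owner.package.procedure; "package_body" in the
-- script looks like a procedure-name placeholder worth double-checking.
```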
Error while loading table from flat file (.csv)
I have a flat file which I am loading into a target table in Oracle Warehouse Builder. It uses SQL*Loader internally to load the data from the flat file, and I am facing an issue. Please see the following error (an extract from the generated error log):
SQL*Loader-500: Unable to open file (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv)
SQL*Loader-552: insufficient privilege to open file
SQL*Loader-509: System error: The data is invalid.
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
I believe that this is related to a SQL*Loader error.
Actually, the flat file resides on my system (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv) and I am connecting to an Oracle server.
Please suggest:
Is it required that I place the flat file on the Oracle server system?
Regards,
Ashoka BL
Hi
I am getting an error as well, similar to that described above, except that I get:
SQL*Loader-500: Unable to open file (/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
The difference is that Ashoka was getting
SQL*Loader-552: insufficient privilege to open file
and I get
SQL*Loader-553: file not found
The initial thought was that the file does not exist in the directory specified, or that I had spelt the filename incorrectly, but this has been checked and double-checked. The unix directory also has read and write permissions.
Also in the error message is
Control File: C:\u21\oracle\owb_staging\WHITEST\source_depot\INV_LOAD_LABEL_INVENTORY.ctl
Character Set WE8MSWIN1252 specified for all input.
Data File: /u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv
Bad File: C:\u21\oracle\owb_staging\WHITEST\source_depot\Durham_Inventory_Labels.bad
As can be seen from the above, it seems to be trying to create the .ctl and .bad files on my C drive instead of on the server, in the same directory as the .csv file. The location is registered to the server directory /u21/oracle/owb_staging/WHITEST/source_depot.
I am at a loss, as this works fine in development and I have just promoted all the development work to a systest environment using OMBPlus.
The directory structure in development is the same as in systest, except that the data file is /u21/oracle/owb_staging/WHITED/source_depot/Durham_Inventory_Labels.csv, and everything works fine - the .ctl and .bad files are created in the same directory and the data successfully loads into an Oracle table.
Have I missed a setting in OWB during the promotion to systest, or is there something wrong in the way the repository in the systest database is set up?
The systest and development databases are on the same box.
Any help would be much appreciated
Thanks
Edwin -
I did the following to create a csv file:
1) I created a EXCEL_FILE Data Server for the FILE technology
2) I created a EXCEL_FILE Physical Schema to declare the directory where the files are stored
3) I created a EXCEL_FILE Logical Schema
4) I associated the logical and physical for a context
5) I created a model based on a File Technology and a data store
6) I try to create an interface with a table source datastore and a File target datastore. ODI maps the two datastore with the IKM SQL Control Append and when I run the interface, it fails.
==> Can anyone explain to me how to load data to a file with ODI, please?
==> Why can't I get the IKM SQL to File Append?
Thanks a lot
For a csv file, define it as in the File Technology.
Then in the Data Model, define the parameters of how it is set up (headers, separators, etc.).
Then you will be able to use SQL to File Append.
Don't forget to use a staging area different from the target, as the file technology is not capable of hosting the staging tables (if any).
Thanks for your answer.
==>> I don't know how to set the headers parameters.
I used another RKM and LKM to load my Excel file and it worked. -
Load table data to any server-client
Hi...
I have a requirement to load Cost Centre data from the CSKS and CSKT tables to any server-client. What is the best way to accomplish this? Do I have to use RFC, or should I just put the cost centres into a file and load it into the other system?
Regards,
Aruna Nivetha.R
Hi,
Actually, this is more a task for your ABAP or BASIS team, but you can do it yourself as well. You can read more here:
http://help.sap.com/saphelp_47x200/helpdata/en/94/e2d63b8ad9c01ce10000000a11402f/frameset.htm
Another way is to put this master data in a change request and transport it. You can do that via OKE6.
Regards,
Eli -
Payroll Conversions to load tables T558B,T558C, T5U8C -- Please Suggest
Hi All
We are currently doing data conversion at a client with 100,000+ employees. We are trying to load the tables:
T558B - Payroll Periods
T558C - Payroll Account Transfer: Old Wage Types
T5U8C - Transfer external payroll results (USA)
for all the YTD taxes for the Employees.
We have designed LSMWs using transaction code SM30 to load all the employees. When we load using SM30, it takes a very long time. Can anyone suggest whether this is best practice, or whether there is an alternative way to load this data into SAP without using SM30?
Please let me know
Thank you
Deepthi
Note: I have checked SAP Best Practice, and that document uses an SM30 transaction recording.
Hi
We are facing the same problem. Can you please tell us how you resolved it?
Venkat