Create Job for LSMW (periodic data transfer) - IDoc BLAORD
Hi,
We are trying to create a job for LSMW (through the frame program for periodic data transfer) for IDoc BLAORD (purchasing contract).
When I run the job that I created, the job is cancelled and the error is:
"Upload from front-end is not possible in batch mode.
Message no. /SAPDMC/LSMW_OBJ_070 020"
How can I correct that, so that I will be able to run the job? (The job must run in the background.)
Thanks,
Tal Sasson
Hello,
I was also receiving the same message /SAPDMC/LSMW_OBJ_070 020, even though I had specified the server path in /SAPDMC/SAP_LSMW_INTERFACE. The problem was that in LSMW -> Specify Files you have to delete the definition of the front-end files and define application server files instead.
The Flag File (Path and Name) field (/SAPDMC/LSOINP-FILENAME in program /SAPDMC/SAP_LSMW_INTERFACE) is of little use, since it is ignored anyway: the files defined in the LSMW object are the ones imported, and they all have to be read anyway if you use more than one file.
Regards,
Peter
Similar Messages
-
LSMW - Periodic Data Transfer - wildcards in filename
Hello,
I have the following problem: the file that has to be periodically read in is identified by date + time, e.g. DATAFILE_200820071250.txt.
In the step 'Specify Files' I entered:
\\server\DATAFILE_*.txt
In the step 'Frame Program for Periodic Data Transfer' I entered the following interval in 'Value for Wildcard (*)': 0000000000000000 - 99999999999999999.
During execution (as a job with a variant, or directly) I always receive the same error message (/SAPDMC/LSMW524): "Specified values for wildcard (*) have not been defined".
How do I have to define the filename with wildcard (*) for a file that is periodically read in by LSMW?
Thanks for your help!
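For illustration only (this is a guess at the mechanism, not SAP code): the frame program substitutes a value for the * and accepts a file only if that value lies inside the interval entered under 'Value for Wildcard (*)', so the interval's digit count has to fit the timestamp in the file name.

```python
import re

def wildcard_in_interval(filename, pattern, low, high):
    # Build a regex from the LSMW-style pattern, capturing what * matched.
    regex = re.escape(pattern).replace(r"\*", "(.*)")
    m = re.fullmatch(regex, filename)
    if not m or not m.group(1).isdigit():
        return False
    # The captured timestamp must lie inside the wildcard interval.
    return int(low) <= int(m.group(1)) <= int(high)

# 12-digit interval matches the DDMMYYYYHHMM stamp -> accepted
print(wildcard_in_interval("DATAFILE_200820071250.txt", "DATAFILE_*.txt",
                           "000000000000", "999999999999"))  # True
# Upper bound with too few digits -> the file falls outside the interval
print(wildcard_in_interval("DATAFILE_200820071250.txt", "DATAFILE_*.txt",
                           "000000000", "999999999"))  # False
```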
Atentamente le saluda / Mit freundlichen Grüssen / Yours sincerely
Nicole Miosga
Hi Nicole,
As far as I know, the way you have assigned the file name is incorrect;
just check it once. Also, in Specify Files there are values for the wildcard;
I want to know what you entered there.
Thanks
Naveen khan -
ALE/iDoc for Work Center data transfer
Hello, community.
I am working on several interfaces for the PP and PM modules using iDocs/ALE.
One of the requirements is that only created and modified data must be distributed from R/3.
For Work Centers, we are using basic type LOIWCS02 and message type LOIWCS. We are generating iDocs through transaction POIM, which has two transfer modes: Complete (data for all Work Centers etc.) and Change (only new and updated data).
The Change Transfer Mode assumes you have set both Change Pointers and Change Documents for the objects to be transferred. Otherwise, it'll work just like the Complete Transfer Mode.
Our PP consultant told me there is no configuration to generate Change Documents for Work Center data modifications.
Can anybody give me other opinions? I'll resort to Z-table solutions only if I have no other options.
Hi Danilo,
Do you know what the process codes are for
LOIWCS and LOIROU ?
Appreciate your help.
Thanks,
Gopi. -
Reg lsmw for customer master data transfer
Hello All,
I want to know, while transferring data by LSMW for customer master data:
Scenario: there is more than one ship-to party and bill-to party, so how do we upload the data in this case?
Thanks,
Sunny
Hi,
just refer to the link below
http://www.sapmaterial.com/?gclid=CN322K28t4sCFQ-WbgodSGbK2g
step by step procedure with screen shots
regards
vijay
please don't forget to reward points if helpful -
Create values for analogous periods in the same table with DS
Hi,
in the course of a pilot we want to create additional columns with data for analogous periods.
We have values for a month and need to create new columns for values for last month and last year in order to compare.
Attached you'll find a few sample rows resembling the basic concept of what we have:
Region Country Year Month Value
Europe Spain 2009 1 285
Europe Spain 2009 2 129
Europe Spain 2009 3 153
Europe Spain 2008 1 288
Europe Spain 2008 2 284
Europe Spain 2008 3 125
Europe France 2009 1 135
Europe France 2009 2 125
Europe France 2009 3 284
Europe France 2008 1 208
Europe France 2008 2 221
Europe France 2008 3 220
In this case, new columns would be
LastMonth (125 for Spain month 1 year 2009, since this is the value for month 3 year 2008)
LastYear (288 for Spain month 1 year 2009, since this is the value for month 1 year 2008).
Can you help on what function to use in the DS query transformation?
thanks everybody, and enjoy!
Hi David,
Probably the easiest way of resolving this would be to add extra columns which show the projected date each row should be shown on e.g.
Region Country Year Month Value LM_Year LM_Month LY_Year LY_Month
Europe France 2008 1 208 2008 2 2009 1
Europe France 2008 2 221 2008 3 2009 2
Europe France 2008 3 220 2008 4 2009 3
Europe France 2009 1 135 2009 2 2010 1
Europe France 2009 2 125 2009 3 2010 2
Europe France 2009 3 284 2009 4 2010 3
This would be written to a staging table then used as a source with the original source file to produce the target table. I've illustrated an example of how this could be achieved below:
Create two dataflows; DF 1 and 2.
In DF 1, read the source and add additional columns to show the date (year, month, etc.) each row should be projected as. This needs to be done for both the last month and the last year. Load the staging table.
In DF 2, read the source and outer join to the new staging table twice. Use Region, Country, Year, Month to join, matching against the new projected date columns. Ensure the last-month date fields are used on one staging table and the last-year date fields on the other. The value from each staging table then becomes the last-month and last-year value.
Load into target table.
To calculate the projected date columns you would probably want to format the source fields into a date and then add the required number of months, e.g. add_months(to_date('01' || month || year, 'DDMMYYYY'), 12)
Obviously this needs the staging table to be truncated before each load, and you would need to assess your load strategy. With these calculated fields, if you have any retrospective updates on the source then they would need recalculating, so this assumes a complete reload each time.
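As a plain illustration of the projection step (Python as pseudocode only; in DS this would be expressions in the query transform), each source row is stamped with the year/month on which it should surface as the "last month" and "last year" comparison value:

```python
def shift_month(year, month, delta):
    """Add `delta` months to (year, month), rolling over year boundaries."""
    idx = year * 12 + (month - 1) + delta
    return idx // 12, idx % 12 + 1

def project(row):
    year, month = row["Year"], row["Month"]
    lm_y, lm_m = shift_month(year, month, 1)   # surfaces one month later as "last month"
    ly_y, ly_m = year + 1, month               # surfaces one year later as "last year"
    return {**row, "LM_Year": lm_y, "LM_Month": lm_m,
            "LY_Year": ly_y, "LY_Month": ly_m}

row = {"Region": "Europe", "Country": "France", "Year": 2008, "Month": 3, "Value": 220}
print(project(row))  # LM_Year 2008, LM_Month 4, LY_Year 2009, LY_Month 3
```

The France 2008/3 row accordingly gets LM 2008/4 and LY 2009/3, matching the projected table above.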
Edited by: Richard Saltern on Jul 16, 2010 2:01 PM
Edited by: Richard Saltern on Jul 19, 2010 10:10 AM -
Hello,
I'm looking for the ABAP program name for creating a job for MD03.
Could anyone help with this? Or just tell me how I could create a job for MD03?
Thank you
Dear,
Program SAPMM61X is for MD01, MRP at plant level.
For background use RMMRP000.
Please refer this link,
Re: MRP background job
Regards,
R.Brahmankar -
Need help in creating prompt for Month To Date Report.
Hi All,
I need to create a Month To Date report using month (start date & end date) as prompts. By default it has to run on the previous month's data, or if the user selects his own range of dates then it has to bring the data for that range of dates. One more thing: I am not showing the month and date columns on the report. Can anyone help me with how to create a prompt for this in DeskI XIR2?
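The "default to the previous month" part boils down to deriving the previous month's first and last day from the run date; a sketch of that logic (generic Python, not DeskI syntax):

```python
from datetime import date, timedelta

def previous_month_range(today):
    """Return (first_day, last_day) of the month before `today`."""
    first_of_this_month = today.replace(day=1)
    last_of_prev = first_of_this_month - timedelta(days=1)  # step back into prev month
    return last_of_prev.replace(day=1), last_of_prev

start, end = previous_month_range(date(2024, 3, 15))
print(start, end)  # 2024-02-01 2024-02-29
```

When the user supplies a range, the report would simply use that range instead of this default.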
Could you clarify some things? You are saying you are trying to transform the XML output of a web service with an XSD, but an XSD is a schema that describes the allowed format of an XML file. If you want to transform XML from one format to another (from one XSD to another) you have to use an XSL file. You can build and test this with JDeveloper. Are you calling the web service from ESB or BPEL?
Kind Regards,
Andre -
Creating Conditions for Purchase Info Records using IDoc COND_A01
Hi All,
we are using IDOC COND_A01 to update and add conditions in ME12 transactions.
But when we send a condition IDoc which contains 2 condition types (segment KONP), the standard function call creates only one database entry in table KONP. The entry in KONP is created for the standard condition type PB00, but not for the company-specific type (say ZABC). Yet when we create them manually in ME12, maintaining a purchase info record, we get both entries in the KONP database table.
Has anyone come across such a problem? Is there any special field in IDoc COND_A01 which controls the creation of the condition types?
Thanks for any answer!
You should use basic type INFREC01 instead of INFRECMASS01.
Within the FM there is a check on the basic type, like:
* check idoc type
if f_idoc_control-idoctp <> c_idoctp_infrec01. " INFREC01
raise wrong_function_called.
endif.
Edited by: Joyjit Ghosh on Sep 15, 2008 4:57 PM -
For Init without data transfer showing records 0 from 0
Hi all,
I am doing an init without data transfer to an ODS. In the ODS manage it shows 1 from 1, which is correct. This ODS data is automatically updated to a cube. There, in the cube manage, it shows 0 from 0. My mappings in the update rules are also correct, so why is it not showing 1 from 1?
In the cube manage the status shows green, but in the monitor of the cube it is always yellow.
Help me please
Ram
Hi Ram,
If you load data with an init without data transfer, the system shows the default record in the monitor.
If you check for that record in the PSA you won't find it, nor on the cube.
If the load is green with 1 record then the extractor is working fine. It's not actually a data record.
Hope it helps.
Regards,
Chama. -
Source for Init with data transfer and full load
Hi Experts,
1. Please tell me where the data comes from when we do the following actions:
a) Init with data transfer
b) Full load
I want to know the source of the data, i.e. the setup table, main table, or any other table.
It would be helpful if you can provide the data flow.
Kindly tell which is preferable and why.
Regards,
RG
Hi,
When you do an init with data transfer, it reads data from the setup table only; but besides this it also sets the init flag. Because of this, any new records from then on go to the delta queue, and the next time you run a delta load it picks the records up from the delta queue.
Now suppose the data is already available on the BW side; then you can run an init without data transfer. This only sets the init flag; it does not pick up any records.
The extraction structure gets data from the setup table in the case of a full upload, and from the delta queue in the case of a delta load.
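A toy model of that flow (plain Python, not SAP code) may make it concrete: full and init-with-data-transfer read the setup table, init-without-data-transfer only sets the flag, and once the flag is set new postings go to the delta queue, which a delta load drains:

```python
class DataSource:
    """Toy stand-in for a BW DataSource with a setup table and delta queue."""
    def __init__(self):
        self.setup_table = []
        self.delta_queue = []
        self.init_flag = False

    def post_document(self, doc):
        if self.init_flag:            # change recording active only after init
            self.delta_queue.append(doc)

    def load(self, mode):
        if mode == "full":
            return list(self.setup_table)
        if mode == "init_with_data":
            self.init_flag = True
            return list(self.setup_table)
        if mode == "init_without_data":
            self.init_flag = True     # flag only, no records
            return []
        if mode == "delta":
            records, self.delta_queue = self.delta_queue, []
            return records

ds = DataSource()
ds.setup_table = ["doc1", "doc2"]
ds.post_document("doc3")             # ignored: init flag not set yet
print(ds.load("init_without_data"))  # []
ds.post_document("doc4")
print(ds.load("delta"))              # ['doc4']
```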
Regards,
Debjani
Edited by: Debjani Mukherjee on Sep 20, 2008 11:45 AM -
Regarding Batch Jobs for migration of data from client DB to our local DB
Hi Folks,
Good day. Previously I worked as an Oracle developer and recently moved to a big organisation as a PL/SQL developer. Here my main work is with interfaces (stored procedures), that is, retrieving data from the client DB and putting the data into our staging tables. For that I created 2 interfaces (stored procedures). One is for a one-time migration; no need to run it on a regular basis. The second one is to run regularly, because if any updates are done in the client DB they should be replicated to our DB. For that we need to run a batch job regularly, on a 24x7 basis.
But I don't have any working knowledge of batch job creation and running. Please guide me on this issue ASAP.
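The recurring job itself (whatever scheduler runs it) usually just calls the interface procedure; the part that makes it safe to run 24x7 is typically a watermark query that copies only rows changed since the last run. A stand-in sketch (Python with SQLite instead of Oracle; the table and column names are made up for illustration):

```python
import sqlite3

# Hypothetical source table with a last_modified stamp per row.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE client_orders (id INTEGER, amount REAL, last_modified INTEGER)")
src.executemany("INSERT INTO client_orders VALUES (?, ?, ?)",
                [(1, 10.0, 100), (2, 20.0, 150), (3, 30.0, 200)])

def incremental_pull(conn, watermark):
    """Fetch only rows changed since the last run; return them plus the new watermark."""
    rows = conn.execute(
        "SELECT id, amount, last_modified FROM client_orders "
        "WHERE last_modified > ? ORDER BY last_modified", (watermark,)).fetchall()
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

rows, wm = incremental_pull(src, 100)   # everything changed since the last run
print(rows)   # [(2, 20.0, 150), (3, 30.0, 200)]
print(wm)     # 200
```

The scheduler (DBMS_JOB on older releases, DBMS_SCHEDULER from 10g) would invoke the equivalent procedure on a repeat interval and persist the watermark between runs.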
Regards,
Ramesh.
You can try
dbms_job(9i)
or
dbms_scheduler(10g)
Just out of curiosity, tell me, if you are interested:
what were you doing as an "Oracle developer"? (Is it like writing only "SQL"?) -
Create records for missing periods in transformation
I need to create a routine that creates the missing fiscal periods with 0.00 as the value in the key figure, while the characteristics remain the same. Is it possible to create something like this in the transformation? Should I put it in the start routine or in the end routine?
Source
Company 1; Account 1; Fiscper 012009; KF 345 Eur
Company 1; Account 1; Fiscper 032009; KF 123 Eur
Company 1; Account 1; Fiscper 062009; KF 678 Eur
Company 1; Account 2; Fiscper 012009; KF 987 Eur
Final destination
Company 1; Account 1; Fiscper 01.2009; KF 345 Eur
Company 1; Account 1; Fiscper 02.2009; KF 0 Eur
Company 1; Account 1; Fiscper 03.2009; KF 123 Eur
Company 1; Account 1; Fiscper 04.2009; KF 0 Eur
Company 1; Account 1; Fiscper 05.2009; KF 0 Eur
Company 1; Account 1; Fiscper 06.2009; KF 678 Eur
Company 1; Account 1; Fiscper 07.2009; KF 0 Eur
Company 1; Account 2; Fiscper 01.2009; KF 987 Eur
you are right, that's what I'm afraid of.
I already have an end routine:
PROGRAM trans_routine.
* CLASS routine DEFINITION
CLASS lcl_transform DEFINITION.
PUBLIC SECTION.
* Attributs
DATA:
p_check_master_data_exist
TYPE RSODSOCHECKONLY READ-ONLY,
*- Instance for getting request runtime attributs;
* Available information: Refer to methods of
* interface 'if_rsbk_request_admintab_view'
p_r_request
TYPE REF TO if_rsbk_request_admintab_view READ-ONLY.
PRIVATE SECTION.
TYPE-POOLS: rsd, rstr.
* Rule specific types
TYPES:
BEGIN OF _ty_s_TG_1,
* InfoObject: ZIGCHENTI Magn_ci - Entity.
/BIC/ZIGCHENTI TYPE /BIC/OIZIGCHENTI,
* InfoObject: ZIGCHPART Magn_ci - Partner.
/BIC/ZIGCHPART TYPE /BIC/OIZIGCHPART,
* InfoObject: ZIGCHCTSH Magn_ci - CtShare.
/BIC/ZIGCHCTSH TYPE /BIC/OIZIGCHCTSH,
* InfoObject: ZIGCHCURR Magn_ci - Currency.
/BIC/ZIGCHCURR TYPE /BIC/OIZIGCHCURR,
* InfoObject: ZIGCHZONE Magn_ci - ZONE.
/BIC/ZIGCHZONE TYPE /BIC/OIZIGCHZONE,
* InfoObject: 0FISCPER Exercício / período.
FISCPER TYPE /BI0/OIFISCPER,
* InfoObject: 0FISCVARNT Variante de exercício.
FISCVARNT TYPE /BI0/OIFISCVARNT,
* InfoObject: ZIGKFCSAM Magnitude - Consolidated Amount.
/BIC/ZIGKFCSAM TYPE /BIC/OIZIGKFCSAM,
* InfoObject: 0CURRENCY Código da moeda.
CURRENCY TYPE /BI0/OICURRENCY,
* InfoObject: 0FISCYEAR Exercício.
FISCYEAR TYPE /BI0/OIFISCYEAR,
* InfoObject: 0FISCPER3 Período contábil.
FISCPER3 TYPE /BI0/OIFISCPER3,
* InfoObject: ZIGCHAINT Magn_ci - Audit ID.
/BIC/ZIGCHAINT TYPE /BIC/OIZIGCHAINT,
* InfoObject: ZIGCHFINT Magn_ci - Fluxo.
/BIC/ZIGCHFINT TYPE /BIC/OIZIGCHFINT,
* InfoObject: ZIGCHGLAC Magn_ci - Conta.
/BIC/ZIGCHGLAC TYPE /BIC/OIZIGCHGLAC,
* InfoObject: ZIGCHCINT Magn_ci - Categoria.
/BIC/ZIGCHCINT TYPE /BIC/OIZIGCHCINT,
* InfoObject: ZIGCHPINT Magnitude - Perímetro de Consolidação.
/BIC/ZIGCHPINT TYPE /BIC/OIZIGCHPINT,
* InfoObject: ZIGCHCCUR Magn_ci - Consolidation Currency.
/BIC/ZIGCHCCUR TYPE /BIC/OIZIGCHCCUR,
* InfoObject: ZIGCHVINT Magnitude - Versão de Consolidação.
/BIC/ZIGCHVINT TYPE /BIC/OIZIGCHVINT,
* InfoObject: ZIGCHPROD Magn_ci - PROD.
/BIC/ZIGCHPROD TYPE /BIC/OIZIGCHPROD,
* InfoObject: ZIGCHDEST Magn_ci - Dest.
/BIC/ZIGCHDEST TYPE /BIC/OIZIGCHDEST,
* InfoObject: ZIGKFCSAC Magn - Cons.Amount - CURR.
/BIC/ZIGKFCSAC TYPE /BIC/OIZIGKFCSAC,
* InfoObject: ZIGKFCAIP Magn - Cons.Amount - IVA - PMP.
/BIC/ZIGKFCAIP TYPE /BIC/OIZIGKFCAIP,
* InfoObject: ZIGKFCAIR Magn - Cons.Amount - IVA - PMR.
/BIC/ZIGKFCAIR TYPE /BIC/OIZIGKFCAIR,
* InfoObject: ZIGKFVLM Dados Magnitude - Valor Mensal.
/BIC/ZIGKFVLM TYPE /BIC/OIZIGKFVLM,
* Field: RECORD.
RECORD TYPE RSARECORD,
END OF _ty_s_TG_1.
TYPES:
_ty_t_TG_1 TYPE STANDARD TABLE OF _ty_s_TG_1
WITH NON-UNIQUE DEFAULT KEY.
*$*$ begin of global - insert your declaration only below this line *-*
... "insert your code here
*$*$ end of global - insert your declaration only before this line *-*
METHODS
end_routine
IMPORTING
request type rsrequest
datapackid type rsdatapid
EXPORTING
monitor type rstr_ty_t_monitors
CHANGING
RESULT_PACKAGE type _ty_t_TG_1
RAISING
cx_rsrout_abort.
METHODS
inverse_end_routine
IMPORTING
i_th_fields_outbound TYPE rstran_t_field_inv
i_r_selset_outbound TYPE REF TO cl_rsmds_set
i_is_main_selection TYPE rs_bool
i_r_selset_outbound_complete TYPE REF TO cl_rsmds_set
i_r_universe_inbound TYPE REF TO cl_rsmds_universe
CHANGING
c_th_fields_inbound TYPE rstran_t_field_inv
c_r_selset_inbound TYPE REF TO cl_rsmds_set
c_exact TYPE rs_bool.
ENDCLASS. "routine DEFINITION
*$*$ begin of 2nd part global - insert your code only below this line *
... "insert your code here
*$*$ end of 2nd part global - insert your code only before this line *
* CLASS routine IMPLEMENTATION
CLASS lcl_transform IMPLEMENTATION.
* Method end_routine
* Calculation of result package via end routine
* Note: Update of target fields depends on rule assignment in
* transformation editor. Only fields that have a rule assigned,
* are updated to the data target.
* <-> result package
METHOD end_routine.
*=== Segments ===
FIELD-SYMBOLS:
<RESULT_FIELDS> TYPE _ty_s_TG_1.
DATA:
MONITOR_REC TYPE rstmonitor.
*$*$ begin of routine - insert your code only below this line *-*
DATA: wa_result_fim type _ty_s_TG_1.
DATA: l_tabix type sy-tabix.
CLEAR wa_result_fim.
l_tabix = 0.
SORT RESULT_PACKAGE BY /BIC/ZIGCHENTI /BIC/ZIGCHAINT /BIC/ZIGCHGLAC
/BIC/ZIGCHPINT
/BIC/ZIGCHVINT FISCPER.
LOOP AT RESULT_PACKAGE assigning <RESULT_FIELDS>.
IF sy-tabix = 1.
<RESULT_FIELDS>-/BIC/ZIGKFVLM = <RESULT_FIELDS>-/BIC/ZIGKFCSAM.
ELSE.
l_tabix = sy-tabix - 1.
READ TABLE RESULT_PACKAGE INDEX l_tabix INTO wa_result_fim.
IF sy-subrc EQ 0.
IF wa_result_fim-/BIC/ZIGCHENTI =
<RESULT_FIELDS>-/BIC/ZIGCHENTI
AND wa_result_fim-/BIC/ZIGCHAINT =
<RESULT_FIELDS>-/BIC/ZIGCHAINT
AND wa_result_fim-/BIC/ZIGCHGLAC =
<RESULT_FIELDS>-/BIC/ZIGCHGLAC
AND wa_result_fim-/BIC/ZIGCHPINT =
<RESULT_FIELDS>-/BIC/ZIGCHPINT
AND wa_result_fim-/BIC/ZIGCHVINT =
<RESULT_FIELDS>-/BIC/ZIGCHVINT
AND wa_result_fim-FISCYEAR = <RESULT_FIELDS>-FISCYEAR.
* Before I do this calculation I want it to validate whether the period
* exists; if not, it should append it with 0,00 and then do the
* subtraction.
<RESULT_FIELDS>-/BIC/ZIGKFVLM =
<RESULT_FIELDS>-/BIC/ZIGKFCSAM -
wa_result_fim-/BIC/ZIGKFCSAM.
ELSE.
<RESULT_FIELDS>-/BIC/ZIGKFVLM =
<RESULT_FIELDS>-/BIC/ZIGKFCSAM.
ENDIF.
ELSE.
EXIT.
ENDIF.
ENDIF.
ENDLOOP.
*-- fill table "MONITOR" with values of structure "MONITOR_REC"
*- to make monitor entries
... "to cancel the update process
* raise exception type CX_RSROUT_ABORT.
*$*$ end of routine - insert your code only before this line *-*
ENDMETHOD. "end_routine
* Method inverse_end_routine
* This subroutine needs to be implemented only for direct access
* (for better performance) and for the Report/Report Interface
* (drill through).
* The inverse routine should transform a projection and
* a selection for the target to a projection and a selection
* for the source, respectively.
* If the implementation remains empty all fields are filled and
* all values are selected.
METHOD inverse_end_routine.
*$*$ begin of inverse routine - insert your code only below this line*-*
... "insert your code here
*$*$ end of inverse routine - insert your code only before this line *-*
ENDMETHOD. "inverse_end_routine
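The gap-filling itself, separate from the subtraction, can be sketched in plain Python (not ABAP; field names simplified): treating FISCPER as MMYYYY, each key gets every period between its first and last posting, with 0.00 for the missing ones, which is what the routine above would need before computing the monthly delta.

```python
def fill_periods(rows):
    """rows: list of (key, period, value) with period as 'MMYYYY'."""
    def to_idx(p):   # 'MMYYYY' -> linear month index
        return int(p[2:]) * 12 + int(p[:2]) - 1
    def to_per(i):   # linear month index -> 'MMYYYY'
        return f"{i % 12 + 1:02d}{i // 12}"
    by_key = {}
    for key, per, val in rows:
        by_key.setdefault(key, {})[to_idx(per)] = val
    out = []
    for key, vals in by_key.items():
        # Emit every period between first and last seen, zero-filling gaps.
        for i in range(min(vals), max(vals) + 1):
            out.append((key, to_per(i), vals.get(i, 0.0)))
    return out

rows = [("Acc1", "012009", 345.0), ("Acc1", "032009", 123.0)]
print(fill_periods(rows))
# [('Acc1', '012009', 345.0), ('Acc1', '022009', 0.0), ('Acc1', '032009', 123.0)]
```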
Edited by: João Piçarra on Aug 13, 2009 12:18 PM -
Looking for a better data-transfer mechanism
We are currently developing a rich client desktop application.
The client can talk to back-end server via the web service.
The web service is responsible for querying data from the back-end database (oracle).
And for function-like requests from clients, the web service simply submit a job to the oracle db.
The business logic resides in oracle stored procedure.
And the data is in the same database.
When the client requests data that could make the database retrieve 1,000 records out of, say, 1,000,000 records,
what we do now is make multiple web service calls, each time for 100 records (in order to keep the response data package small enough),
until no more can be retrieved.
In detail, we make all the queries appended with an order by clause,
and we send over the starting index along with the number of records retrieving,
and use JDBC result set (in the web service method) to re-position and get the needed data,
then use OracleCachedRowSet to hold the data and then send back to the client.
The problem here seems to be significant.
Multiple web service calls are being wasted.
Each web service call makes the DB run the (inefficient) query again, and wastes a whole bunch of data.
In this querying scenario,
we do not want to separate the data into different pages (like the table in a traditional web app),
and we want to know the metadata for the query result (our client application has to make use of the metadata).
Due to our inexperienced background, we cannot figure out an efficient mechanism for this querying scenario.
We thought socket programming might work for us, since once it's opened, the connection would always be established,
and then we would not have to waste multiple DB queries.
But sockets cannot bypass firewalls, and we would lose many of the benefits introduced by the web service.
Is there an efficient way of doing this in the web service world?
Please enlighten us.
Any suggestion / criticism welcomed.
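One common alternative that keeps the stateless, firewall-friendly web service calls is keyset ("seek") pagination: instead of re-running the ORDER BY query and repositioning by row index on every call, each call passes the last key it saw, so the database can seek via an index and never rescans rows it already returned. A sketch (Python with SQLite standing in for Oracle; in Oracle the same key predicate works together with ROWNUM or ROW_NUMBER to cap the page size):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 251)])

def fetch_page(conn, last_id, page_size=100):
    # Seek past the last returned key instead of skipping N rows.
    return conn.execute(
        "SELECT id, payload FROM records WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, page_size)).fetchall()

last_id, total = 0, 0
while True:
    page = fetch_page(conn, last_id)
    if not page:
        break
    total += len(page)
    last_id = page[-1][0]            # resume after the last key we saw

print(total)  # 250
```

Each web service call stays small and stateless, but the per-call cost no longer grows with how far into the result set the client is.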
Thanks in advance.
Yes, I agree with SAB, the Zalman 7000 is a good one; however, I've got 2 very similar K8N Neo2 systems but with 2 different CPU coolers.
One with a Zalman 7000AlCu and one with a Thermalright XP-90 plus a Zalman 92mm fan (you can use whatever 80/92mm fan you like: high speed, low speed, LED colored, etc.).
Both are equal regarding cooling in my experience, but I preferred the XP-90 because I did not have to remove the backplate (I think Zalman recommends mounting its own backplate to get the right distance/pressure between cooler and CPU).
With the XP-90 you just switch the retention mechanism, which is easy (even with the MoBo in the case). -
APD Process for Cube - ODS data transfer.
hi,
Two DataSources (a cube and an ODS) are feeding another ODS. I want to delete the data in the data target (ODS) belonging to that particular InfoSource on every data load if it comes from the same InfoSource, without using a process chain or events (I want the similar option found in the cube). I also want to write some complex functionality (ABAP code) in the start routine of the update rules.
Bhanu suggested that I use the APD process. Can anyone (including Bhanu) explain the APD process to me in detail? Please also give me the navigation.
Regards,
Avneet
Hi Avneet,
You can read details about the APD here:
http://help.sap.com/saphelp_nw04/helpdata/en/49/7e960481916448b20134d471d36a6b/content.htm
Basically you can get to the APD from transaction RSANWB. Start creating a new process using a query as the source of data. You can provide a transactional ODS as a data target, and you also have the opportunity to change the data using predefined functions, or ABAP code, before loading into the ODS.
Hope this helps... -
Create JOB for Standard report
Hi All,
Can we create a job (SM36) for a standard report and bring it into MS Outlook? Is there any standard procedure available, or do we require a development?
Thanks
SD
YES