Issue in creation of data from extended segment in background mode
Dear all,
I am facing an issue with the creation of data in the receiving system via IDoc for an extended MATMAS segment (extended for the classification view in a user exit). The IDoc is received successfully in the receiving system, but the data is not created in the database, even though the IDoc status is shown as green/successful. When I execute it from WE19, the data is created successfully in the database. I would be thankful if you could guide me in resolving this issue.
Thanks.
Hello,
you can use CALL TRANSFORMATION id, which creates an exact "print" of the ABAP data as XML.
If you need to change the structure of the XML, you can alter your ABAP structure to match the requirements.
Of course you can create your own XSLT, but that is not easy to describe and nobody here will do it for you. If you would like to start with XSLT, you had better start searching.
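As a minimal sketch of the identity transformation (the structure and field names below are illustrative assumptions, not from this thread):

```abap
* Sketch: serialize an ABAP structure "as is" into asXML using the
* identity transformation. Structure and values are made-up examples.
DATA: BEGIN OF ls_material,
        matnr TYPE c LENGTH 18,
        maktx TYPE c LENGTH 40,
      END OF ls_material,
      lv_xml TYPE string.

ls_material-matnr = 'MAT-0001'.
ls_material-maktx = 'Test material'.

* ID is the built-in identity transformation; lv_xml receives the
* asXML representation of the structure.
CALL TRANSFORMATION id
  SOURCE material = ls_material
  RESULT XML lv_xml.
```

If the target XML must look different, renaming or restructuring the ABAP fields (as suggested above) is usually simpler than writing a custom XSLT.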
Regards Otto
Similar Messages
-
Import data from MS ACCESS in background mode
Hello experts,
I am facing the following problem: I have to import an MS Access database in background mode via a job.
I have a solution which works in dialog:
create object conn 'ADODB.Connection'.
create object rsdb 'ADODB.Recordset'.
concatenate 'Provider=Microsoft.Jet.OLEDB.4.0;'
'Data Source=C:\Users\pfahlbe\Desktop\access.mdb'
into sql.
call method of conn 'Open'
exporting
#1 = sql.
Any idea how it can be handled in background?
The system does not have permission to the user's desktop C:\Users\pfahlbe\Desktop\access.mdb when running in the background.
Recommend moving your MSAccess db to a network location where the system has full-time read-permission, and adjusting your sql accordingly.
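The adjusted connection string could look like this (the UNC share name below is a made-up example, not from the thread):

```abap
* Sketch: point the Jet provider at a network share that the system
* can reach at runtime (\\FILESERVER\share is an assumed location).
CONCATENATE 'Provider=Microsoft.Jet.OLEDB.4.0;'
            'Data Source=\\FILESERVER\share\access.mdb'
       INTO sql.
```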
Regards,
zKen -
PDF Creation with data from SAP system
Hi All,
I need to generate a PDF file using Adobe Document Services. The content of the PDF can be from any data source (Oracle / BW / R3), so I require Web Dynpro code for PDF creation with data from any of these systems.
Hi Deepak,
Before starting on the code, make sure you have configured all services in the Visual Administrator.
You can refer to this:
<a href="http://help.sap.com/saphelp_nw2004s/helpdata/en/95/5a08cd0e274a0bae559622d6670722/frameset.htm">Configuration Guide</a>
regards
Sumit -
How to get data from different SEGMENTs
Hi;
My requirement is to retrieve the data from different segments at runtime.
I mean, I have to execute transaction WE19; the IDoc will be given, and after that a user exit is called where I have to retrieve the data from the different segments and store it in an internal table for further calculation.
Can you all please guide me on how to retrieve the data via ABAP code at runtime so that it can be stored in an internal table.
Regards
Shashi
Write code like this:
FORM post_idoc TABLES idoc_data   STRUCTURE edidd
                      idoc_status STRUCTURE bdidocstat.
  LOOP AT idoc_data.
*   Segment names in EDIDD-SEGNAM are stored in uppercase
    CASE idoc_data-segnam.
      WHEN 'ZSEGMENT1'.
        MOVE idoc_data-sdata TO wa_zsegment1.
      WHEN 'ZSEGMENT2'.
        MOVE idoc_data-sdata TO wa_zsegment2.
      WHEN 'ZSEGMENTN'.
        MOVE idoc_data-sdata TO wa_zsegmentn.
    ENDCASE.
  ENDLOOP.
After this, write code to move the data from the work areas to the internal tables.
ENDFORM.
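The move to internal tables mentioned above could be sketched like this (the DDIC structure name ZSEGMENT1 and the table name are assumptions):

```abap
* Sketch only: collect each segment's data into a typed internal
* table instead of only a work area. ZSEGMENT1 is an assumed
* DDIC structure matching the segment definition.
DATA: it_zsegment1 TYPE STANDARD TABLE OF zsegment1,
      wa_zsegment1 TYPE zsegment1.

LOOP AT idoc_data.
  CASE idoc_data-segnam.
    WHEN 'ZSEGMENT1'.
*     SDATA is a flat character field; moving it into the typed
*     work area maps the segment fields, then APPEND collects it.
      MOVE idoc_data-sdata TO wa_zsegment1.
      APPEND wa_zsegment1 TO it_zsegment1.
  ENDCASE.
ENDLOOP.
```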
Hope this helps. -
Creation/modified date from files
Hi all,
I work with Forms6i on a 8.1.6 database.
In my application I copy/move files from e.g. C:\ to a folder on the server. Now I want to check whether the application really copied/moved the file. I would do this with the creation/modified date of the files (e.g. if the modified date on the server is the same as the modified date from C:\, then the file was really copied).
How can I find out the creation/modified date of a file in Forms6i? Which package can I use?
Thanks. Sorry for my English.
Regards
Sandra
In client/server Forms you can use D2KWutil (see the WIN_API_FILE package).
http://www.oracle.com/technology/software/products/forms/files/d2kwutil/d2kwutil_6_0_6_0.zip -
Issue in transfer of data from ECC to APO
Hi All,
I have a requirement to transfer data from ECC to APO. I am using EXIT_SAPLCMAT_001 for this purpose. The problem is that I need to transfer the data of a field that is not present in CIF_MATLOC but is present in /SAPAPO/MATLOC.
How should I proceed? Please help; this is an urgent requirement.
Thanks & Regards,
SriLalitha
Hi,
you may want to go to transaction /SAPAPO/SNP_SFT_PROF.
Determine Forecast of Replenishment Lead Time
Use
In this field, you specify how extended safety stock planning determines the forecast of the replenishment lead time (RLT). The following values are available:
Supply Chain: The system determines the RLT forecast using the supply chain structure by adding the corresponding production, transportation, goods receipt, and goods issue times. If there are alternative procurement options, the system always takes the longest option into account.
Master Data: The system determines the RLT forecast from the location product master data.
Master Data/Supply Chain: First, the system determines the RLT forecast from the location product master data. If no RLT forecast can be determined, the system determines the forecast using the supply chain structure (as described under Supply Chain).
Dependencies
You can retrieve the replenishment lead time forecast yourself by using the GET_LEADTIME method of the Business Add-In (BAdI) /SAPAPO/SNP_ADV_SFT.
Replenishment Lead Time in Calendar Days
Number of calendar days needed to obtain the product, including its components, through in-house production or external procurement.
Use
The replenishment lead time (RLT) is used in the enhanced methods of safety stock planning in Supply Network Planning (SNP). The goal of safety stock planning is to comply with the specified service level, in order to be prepared for unforeseen demand that may arise during the replenishment lead time. The longer the RLT, the higher the planned safety stock level.
Dependencies
The field is taken into account by the system only if you have specified Master Data or Master Data/Supply Chain in the RLT: Determine Forecast field of the safety stock planning profile used.
Hope this helps.
The RLT from ECC is in MARC-WZEIT, which is transferred to APO in structure /SAPAPO/MATIO, field CHKHOR.
Maybe if you maintain the setting in the profile, you will get the value in RELDT.
Thanks. -
Issue when uploading Sales data from DSO to Cube.
Dear All,
I have an issue when I am uploading sales data from a DSO to a cube. I am using BI 7.0 and I have uploaded all sales-document-level data to my DSO. Then I use a transformation rule to calculate the sales value during the DTP to the cube. The cube has customer-wise aggregated data.
In the DSO I have NetPrice (KF) and Delivered_QTY (KF). I do a simple multiplication routine in the transformation from the DSO to the cube:
RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY .
At the moment I use Active Table(With out Archive) on the DSO to get the data since this is my first load.
The issue is that the figure (sales value) in the cube is incorrect. I am getting very large values, which is impossible.
Can someone please help me.
Shanka
Hi,
are you sure that the cube has customer-wise aggregated data? It will always aggregate the key figure values for the same set of characteristics.
Did you check the other key figures as well? Are they also inflated, or is the problem with this key figure only?
During the data load, the records may be aggregated first for the same characteristic values, and the multiplication happens afterwards. If that is the case, you may have to multiply the values before they are aggregated in the data package; this can be achieved through a start routine.
But first verify whether other key figures are also having the same issue.
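A start-routine sketch of this idea (the field names follow the thread; the generated source-structure type name varies by system and is an assumption here):

```abap
* Sketch: multiply price by quantity per document-level record in
* the start routine, BEFORE any aggregation happens on the way to
* the cube. _ty_s_sc_1 is the source-package line type that BI 7.0
* generates in transformation routines; SALES_VAL is an assumed
* target field added to the source structure for this purpose.
FIELD-SYMBOLS <ls_source> TYPE _ty_s_sc_1.

LOOP AT SOURCE_PACKAGE ASSIGNING <ls_source>.
  <ls_source>-sales_val = <ls_source>-net_price * <ls_source>-dlv_qty.
ENDLOOP.
```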
Thanks
Ajeet -
Issue while loading master data from BI7 to BPC
Dear Experts,
I'm trying to load master data from BI7 to BPC NW using scenario 2 of the document below.
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00380440-010b-2c10-70a1-e0b431255827
My requirement is need to load 0GL_ACCOUNT attribute and text data from BI7 system to BPC.
1. As mentioned in the How-to document, I created a dimension called GL_ACCOUNT using the BPC Admin client.
2. I am able to see GL_ACCOUNT in RSA1, but when I try to create a transformation (step 17, page 40) to load the attribute data, I cannot find 0GL_ACCOUNT (which exists in BI7) as the source object of the transformation. When I press F4 in the Name field, I can only see the dimensions available in the BPC system.
What could be the reason for not getting the BI InfoObject as a source in BPC?
Thanks in advance...
regards,
Raju
Dear Gurus,
My issue got resolved. So far I had been trying to pull data from R/3 > BW > BPC. In the existing landscape, BW and BPC are two different boxes. That is the reason I couldn't see the BW objects in BPC (since the two are different boxes). To resolve the issue, I created a new InfoObject (via RSD1) in BPC, and the data load now goes from R/3 > BPC InfoObject (created through RSD1) > BPC dimension.
Thanks and regards,
Raju -
Authorization issue when I display data from ODS, Infocube, Multirprovider
Hi Experts,
When I try to display data for an ODS, InfoCube, MultiProvider, or InfoSet in the production system, I face an authorization issue.
Does anybody have an idea which authorization objects are needed to display data from InfoProviders?
SIRI
Check for the below authorizations in your role:
S_RS_ICUBE
ACTVT 03
RSICUBEOBJ AGGREGATE, CHAVLREL, DATA, DATASLICE, DEFINITION, EXPORTISRC, UPDATERULE
RSINFOAREA *
RSINFOCUBE <your cubes>
S_RS_ODSO
ACTVT 03
RSINFOAREA *
RSODSOBJ <your DSOs>
RSODSPART DATA, DEFINITION -
Issue while loading of data from DSO to InfoCube
Hi Experts,
Can you tell me what the root cause might be if the data coming into the DSO from R/3 is correct and as required, but when loading it from the DSO to the InfoCube it shows wrong data? For example, some line items that were closed are shown as open in the cube, and the KF values are not right either.
Also, there is no routine code involved between the DSO and the InfoCube.
Thanks in adv .
NP
I hope you didn't delete a request from the DSO without deleting the change log. This can cause inconsistency.
If so, delete the data from the DSO (right-click > Delete Data) and reload.
Issue while loading master data from BI to BPC
Dear All,
I am loading master data from BI to BPC using the process chain /CPMB/IMPORT_IOBJ_MASTER. I ran the package and the package status was successful, but a few member IDs that are available in BI are not getting loaded into BPC. I have nearly 1300 IDs in BI; out of those 1300, only 1270 IDs are loaded into BPC.
I haven't restricted any IDs using the conversion file either.
What could be the reason? How can I overcome this and load all member IDs into BPC?
Thanks & Regards,
Ramesh.
Ramesh,
What write mode have you chosen for the /CPMB/IMPORT_IOBJ_MASTER chain?
Check whether you have the Update mode.
Hope this helps... -
Acquire data from Agilent DSO3104 in averaging mode
Hi,
I am trying to acquire data from an Agilent DSO3104X in LabVIEW. When I configure my DSO in normal mode it acquires data, but when I configure it in averaging mode it does not acquire data in LabVIEW.
I am running the LabVIEW code for the DSO "Agilent 2000 3000 X series" (downloaded from http://forums.ni.com/ni/attachments/ni/140/47325/1/agilent_2000_3000_x-series.zip), VI examples "acquire waveform single" / "acquire waveform continuously", but it only acquires in normal mode. Whenever I change to average mode, manually or through the "configure acquisition" VI, I get the same error messages on the PC as well as on the oscilloscope: "No data for operation" and "Query unterminated".
Please find attached the error message.
If I run the "read current waveform" VI, even then averaging mode is not supported.
I have not changed the default timeout values in the default VIs or sub-VIs. Please advise on what values to use for sampling at 5 GSa/s and averaging over 256 samples in the DSO setup.
Attachments:
error.jpg 40 KB -
XLS from Application server in background mode to internal table
Hi,
I need to transfer the contents of an Excel file on the application server to an internal table while running my program in background. My file is XLS; I can't use CSV. For the dataset binary mode, I need to know how to transfer the data to my internal table. I already searched the forum but didn't find an answer, at least not for background.
By the way, I am using version 6 here.
With ALSM_EXCEL_TO_INTERNAL_TABLE I get the upload_ole error all the time, and with TEXT_CONVERT_XLS_TO_SAP, conversion_failed all the time too.
Here is the code, just the load part and the data declarations:
REPORT zbeto.
TYPE-POOLS: truxs.
TYPES:
BEGIN OF y_cot,
text1(12) TYPE c,
text2(12) TYPE c,
text3(12) TYPE c,
text4(12) TYPE c,
text5(12) TYPE c,
END OF y_cot.
DATA: it_datatab TYPE STANDARD TABLE OF y_cot,
wa_datatab TYPE y_cot,
it_raw TYPE truxs_t_text_data.
DATA:
v_file TYPE rlgrap-filename,
begin_col TYPE i VALUE '1',
begin_row TYPE i VALUE '2',
end_col TYPE i VALUE '5',
end_row TYPE i VALUE '102',
t_ctmp TYPE y_cot OCCURS 0 WITH HEADER LINE,
t_xls TYPE alsmex_tabline OCCURS 0 WITH HEADER LINE.
START-OF-SELECTION.
PERFORM f_load_xls.
* FORM f_load_xls
FORM f_load_xls.
v_file = '\\ZSAPDEV\SAPDEVINTERF$\COTACAO\TESTE.XLS'.
CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
EXPORTING
filename = v_file
i_begin_col = begin_col
i_begin_row = begin_row
i_end_col = end_col
i_end_row = end_row
TABLES
intern = t_xls
EXCEPTIONS
inconsistent_parameters = 1
upload_ole = 2
OTHERS = 3.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
EXPORTING
*   i_field_seperator =
i_line_header = 'X'
i_tab_raw_data = it_raw " WORK TABLE
i_filename = v_file
TABLES
i_tab_converted_data = it_datatab[] "ACTUAL DATA
EXCEPTIONS
conversion_failed = 1
OTHERS = 2.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
t_ctmp[] = it_datatab[].
ENDFORM. "f_load_xls
Both FMs aren't working, and when I try to execute in background mode, before anything else, I receive the following error:
Message text:
Error during import of clipboard contents
Message class:
ALSMEX
Message no.:
037
Message type:
A
And without the ALSM FM, TEXT_CONVERT gives me the conversion_failed exception...
The directories are right, that I am sure of, because I just copied & pasted them from CG3Z/CG3Y when I uploaded and downloaded to
check the file on the server...
Am I missing something?
thanks again,
Roberto Macedo
(PS: I made another topic because no one replied to the other one in 4 days and it wasn't solved yet.)
Hi!
You didn't find an answer for this because it is not possible. If you run your program in background, it runs on the server and does not have any connection to your local machine. That's why you can't upload/download via the frontend in background mode.
You might try to address your local PC somehow, with its IP or MAC address, but I don't think this task is worth so much time.
Run your program in online mode, or, if you want to run it in background, upload your file to the SAP server.
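Once the file is on the application server in a form that can be parsed there (e.g. a tab-delimited text export rather than native XLS), a server-side read could be sketched like this (the file path is an assumption; y_cot is the structure declared in the report above):

```abap
* Sketch: read a tab-delimited text file from the application server.
* This runs fine in background because no frontend is involved.
* The path below is an assumed example location.
DATA: lv_file TYPE string VALUE '/usr/sap/trans/data/cotacao.txt',
      lv_line TYPE string,
      lt_cot  TYPE STANDARD TABLE OF y_cot,
      ls_cot  TYPE y_cot.

OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  DO.
    READ DATASET lv_file INTO lv_line.
    IF sy-subrc <> 0.
      EXIT.                      " end of file reached
    ENDIF.
*   Split each line at the tab character into the five text fields
    SPLIT lv_line AT cl_abap_char_utilities=>horizontal_tab
          INTO ls_cot-text1 ls_cot-text2 ls_cot-text3
               ls_cot-text4 ls_cot-text5.
    APPEND ls_cot TO lt_cot.
  ENDDO.
  CLOSE DATASET lv_file.
ENDIF.
```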
Regards
Tamá -
Urgent- Issues in Creation of Web Service extending Portal Service
Hi Experts,
I am facing the following issues:
1. I've created a Portal Service and extended it as a Web Service; I am not able to view this web service in the Web Service Navigator.
2. When I try to send a request to the WS from the EP Web Service checker in NWDS, I get an error: "The User Authentification is not correct to access to the Portal Service com.sap.portal.prt.soap.PortalUMEService or the service was not found".
Any help from you is highly appreciated.
Regards,
Maruti Prasad
Hi Maruti,
I have a few PDFs regarding this. I did a sample WSDL and tested it; it's working fine.
Please send me an e-mail and I will send them to you.
Regards ,
venkat -
I have a primary DPM server that is protecting various file servers and an Exchange DAG. Everything is working as it should on this server, but I am experiencing constant issues on the secondary DPM server protecting these same data sources.
On the primary, two protection groups in particular are set up to protect two different volumes on the same file server. Volume D:\ is 46 TB in size, with a deduplicated file size of 22 TB (actual 39 TB), whilst volume E:\ is 25 TB in size, with a deduplicated file size of 2.5 TB (actual 5.5 TB).
Issue 1:
As expected, volume D:\ was initially replicated to the secondary DPM server at its undeduplicated size (39 TB), as was volume E:\, with an initial replication to the secondary of 5.5 TB. So when is Microsoft going to support dedup on a secondary DPM? It seems daft to support dedup on the primary DPM, which is always more likely to be close to the original data sources on the LAN, and not on the secondary DPM, which is most likely to be placed offsite on a WAN!
Issue 2:
I have a big issue with the subsequent synchronizations on the secondary server for E:\, which seems to transfer almost the full 5.5 TB every day (I have it set to sync every 24 hrs), although the data on that volume is fairly static (i.e. unchanging). On one occasion a sync continued for over 48 hours for this volume and had transferred over 20 GB (according to the DPM console) until I manually cancelled the job. How is that possible on a volume with only 5.5 TB (un-deduplicated)? What is going on here? Has anyone any ideas?
Issue 3:
Another file server I am protecting on both the primary and secondary DPM servers always fails over to a consistency check on the secondary server, usually because it cannot access a particular file, which results in an inconsistent replica.
However, the sync (and subsequent restore point) on the primary DPM server from the original data source always completes without any issues. Again, has anyone any clues?
I do get the impression that the whole secondary DPM thing is not quite robust enough. I can only assume that, as the primary seems to protect the original data sources OK, the issue is with the secondary reading the information on the primary DPM.
I tried changing the time of the synchronization, but that didn't help.
Meanwhile, I was working on another unrelated case with Microsoft, so I didn't want to have a second case open at the same time. So I waited for some weeks with no change in this problem. Then, about a day or two before I was finally going to call Microsoft to open a case (months after the problem had started), the problem suddenly resolved itself, with no input from me! So I don't really know if it was time that fixed it eventually, or what. Sorry I can't be of more help.
Rob