FTP Transport Method in Data Services SAP Applications Datastore
Hi,
We have SAP ECC and we are trying to install RapidMarts for SD and Finance. When we try to run the RapidMart job, we get the error:
Error opening the file c:\temp\billingblock.daa
We are using the FTP transport method, as our SAP servers are Unix and the DSE server is on Windows. We have an FTP server which is on Linux.
Here are the parameters I am trying to use:
Data Transfer Method: FTP
Working Directory on SAP Server: /DEV/Test/ -- directory on the FTP server on Linux
Local Directory: C:\temp
Global Directory: C:\temp
FTP relative path to SAP working directory: /DEV/Test/
FTP Host Name: IP address (the FTP server's IP address)
User ID: testftp
Password: XXXXXX
When I try to run the job, it throws an error. The FTP user ID and password I am using have all rights on the FTP server.
Let me know if I have to look at something else.
Thanks,
Vinod
Hi Vinod,
Did you find a way to make it work ?
We have the same issue here...
Server #1 - Unix ECC
Server #2 - FTP Server
Server #3 - DataService server. (WINDOWS)
The .dat files are created on the ECC server in the working directory, and a third-party tool detects that the directory content changed and starts uploading the file to the FTP server.
The problem is that when the ECC server is done creating the .dat, I think it signals Data Services that the job is done. Then DS tries to fetch the file from the FTP server, but the file is not there yet...
Can ECC do the FTP upload first and then send the command to DS that the job is done?
Best regards,
David
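The race David describes (DS reading the transport file before the third-party upload has finished) can be worked around by polling the remote file until its size stops changing before reading it. Here is a minimal sketch of that idea in Python; the function name and parameters are made up for illustration, and with ftplib the probe could be something like `lambda: ftp.size(path)` wrapped in a try/except:

```python
import time

def wait_for_stable_file(probe_size, checks=3, interval=1.0, timeout=60.0):
    """Poll a remote file until its size is unchanged for `checks`
    consecutive probes, i.e. the upload has (probably) finished.

    probe_size: callable returning the file's current size in bytes,
                or None if the file does not exist yet.
    Returns the final size, or raises TimeoutError.
    """
    deadline = time.monotonic() + timeout
    last, stable = None, 0
    while time.monotonic() < deadline:
        size = probe_size()
        if size is not None and size == last:
            stable += 1
            if stable >= checks:
                return size
        else:
            stable = 0          # size changed (or file missing): restart count
        last = size
        time.sleep(interval)
    raise TimeoutError("file never stabilized")
```

The same stable-size check works whether the probe goes over FTP, a shared directory, or any other transport.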
Similar Messages
-
Data Services / SAP / ABAP
Hi,
I would like to use a variable as a filter in an ABAP component.
My variable is a string that I use in the filter like this:
KNUMV IN ($MYVAR)
with myvar = '0001820242','0001820248','0001820250','0001820253','0001820254','0001820257','0001820258','0001820260','0001820263','0001820264','0001820265','0001820269','0001820270','0001820271','0001820275','0001820276','0001820278','0001820279','0001820280','0001820287','0001820288','0001820295','0001820297','0001820298','0001820301','0001820302','0001820303','0001820308','0001820309','0001820310','0001820312','0001820313','0001820314','0001820316','0001820317','0001820322','0001820326','0001820329','0001820330','0001820342','0001820346','0001820348','0001820350','0001820351','0001820357','0001820380','0001820382','0001820383','0001820384','0001820390','0001820391','0001820392','0001820393','0001820406'
I get the error: <Literals that take up more than one line are not permitted.>
Does someone have an idea, please?
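In ABAP itself the usual fix for this error is to avoid one huge literal altogether, for example by filling a ranges/internal table and filtering with `KNUMV IN lt_range`, or by splitting the value list into several shorter IN clauses. As a language-neutral sketch of the chunking idea (Python; the helper name and chunk size are invented):

```python
def chunked_in_filters(column, values, chunk_size=20):
    """Split a long value list into several `col IN (...)` clauses
    joined with OR, so no single literal line grows too long."""
    clauses = []
    for i in range(0, len(values), chunk_size):
        chunk = values[i:i + chunk_size]
        quoted = ",".join("'%s'" % v for v in chunk)
        clauses.append("%s IN (%s)" % (column, quoted))
    return "\n OR ".join(clauses)
```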
Thanks.
Hi Raghu,
1) Oracle 10g > non-SAP universe > WebI report > Live Office > Xcelsius
Here we can make use of the background scheduling of WebI reports and use the latest instance if the volume of data in the tables is large.
As WebI is a very good tool, we can consolidate data and optimise the performance of the dashboard.
Note: Universe design should be done very carefully.
2) Oracle 10g/SAP R/3 (using Data Services) > ABAP function module/InfoSet > Crystal Reports 2008 > Live Office > Xcelsius.
SAP R/3 > function module/InfoSet > Crystal Reports 2008 > Live Office > dashboard.
Here we can make use of the background scheduling of Crystal Reports and use the latest instance if the volume of data in the tables is large.
It is good if you implement all the logic on the ABAP side and simplify the work on the Crystal Reports side, so that dashboard performance is good if the refresh option is on demand.
Note: An ABAP consultant is needed.
3) Oracle 10g > Crystal Reports 2008 > Live Office > Xcelsius
In this case you have to implement all your logic in Crystal Reports, schedule the data, and use the latest instance for better dashboard performance.
The decision has to be made depending on the volume of data and the logic behind the report to display the final data.
If you are using WebI in your flow you can have better staging, and ad hoc analysis can be given to the user.
Case 2 is also good if you are doing everything at the ABAP level.
I don't suggest the third case.
Hope this helps you.
@sri -
Intermittent FTP transport method failure
I am receiving an intermittent failure on the R/3 DataFlow transport using FTP to a Linux job server. The file is terminated prematurely, and the job then fails. When re-executing the job for the same time interval (assuming that the same data is in the transport file), the job runs successfully.
What are the switches on the FTP request initiated by the job server?
Could this be due to an FTP server setting on the SAP host?
Could concurrent parallel FTP requests for transport files be causing this (the job runs DFs in parallel)?
Thanks for any response... -rs-
-
Loading data from SAP ECC to DS---Error
Hi all,
I am using an ABAP dataflow for loading data from a table in an SAP application to an Oracle database. I am getting an error during execution as follows.
I am using the direct download method for the data transfer. Need your valuable inputs.
Hi phaneendranadh kandula,
Direct download method: it directly transfers the data from the SAP application server to the client download directory.
Note: It is not recommended because we cannot schedule the job, and it is not recommended for large amounts of data.
Data transfer methods:
Data Services with SAP Direct Download data transfer method
Not recommended
We cannot schedule
Not recommended for large amounts of data
Data Services with SAP Shared Directory data transfer method
Recommended
Secure method
Can handle large amounts of data
Can be executed in the background
Data Services with SAP FTP data transfer method
Recommended
Secure method
Applicable in multiple-OS environments
Data Services with SAP Custom Transfer data transfer method
Recommended
Highly secure -
Exposure of BI Content/Generic DataSources in Data Services XI 3.x
We were told by SAP Business Objects personnel back in May 2009 that SAP DataSources (standard content and generic) would be exposed in SAP Business Objects Data Services XI 3.2. We have recently upgraded Data Services, in a sandbox environment, to XI 3.2 and found that this isn't there.
After pinging some others that we know at SAP, we're now hearing that this won't be available until SP4. Is this correct? Right now the unrestricted-shipment version is SP0, and we're wondering if it truly is SP4 or SP0 FP4 that is scheduled to be in unrestricted shipment, and when these are scheduled to be released for general customer consumption. Can anyone from SAP monitoring this forum comment?
Hi Dennis:
BW uses the business content extractors. In 4.0 Data Services will also use the Business Content extractors.
This does not mean BW customers have to start using Data Services; SAP is not replacing the connectivity between ECC and BW. Data Services is added as an option to the landscape and adds the option of accessing other third-party sources: one single tool to extract data.
With Data Services, you have the possibility to consolidate extraction rules into one platform. You can also make sure you can load correct and clean data into target application with the data quality components of Data Services.
You just need to know the name of the extractor and connect. No ABAP programs are generated, making it easier to deploy and maintain.
Since you originally posted your question almost two years ago you might be aware of this, but I think people visiting this forum in the future may find very useful information on the blog by Tammy Powlas.
SAP BusinessObjects Data Services - What is new in 4.0?
/people/tammy.powlas3/blog/2011/04/18/sap-businessobjects-data-services--what-is-new-in-40
Regards,
Francisco Milán.
Edited by: Francisco Milan on May 24, 2011 4:12 PM -
Odbc connectivity with Data services 4.2
Hi guys
I have a very general question to ask...
Presently I have a dataflow with one source table connected to a query transform, and the query transform connects to a target table. The query transform has no filter, no order by, no group by: it is a simple select * from the source table. The company where the source tables reside has given us their ODBC driver to connect and extract the data, but it takes a very long time to copy at our end. I admit that the data is more than 100 million records (it takes more than 4 hours), but could you please help me debug what can be done to increase performance, and how I can check why it takes so long?
The ODBC connection is set up in Administrative Tools -> ODBC. Is there any other way to set up the ODBC?
When I run the same query in the tool they provided to connect to the LIVE system, it takes half the time Data Services takes to copy.
I have tried changing the dataflow properties between in-memory/pageable, with degree of parallelism.
What can I check on my end, in case anything is going wrong at my end, before I say anything to the source company?
I would be very grateful if anyone can help me understand this.
Regards
Hi,
If your database is SQL Server, install the SQL Native Client on the DS machine to connect to the DB. You don't need ODBC for connecting DS to the DB.
Check these, assuming your DB is SQL Server:
Can Data Services 4.1 Datastore connect to 32-bit MSSQL 2005 DB?
SAP Data Services 4.1 - SQL Server Native Client issue
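On the performance question itself, one thing worth checking on a 100-million-row copy is whether rows are fetched and inserted in bulk rather than one at a time; any DB-API style driver (including ODBC ones) exposes fetchmany for this. A rough sketch of the batching pattern, using sqlite3 only so the example is self-contained; the table and column names are invented:

```python
import sqlite3

def copy_in_batches(src_conn, dst_conn, batch_size=10000):
    """Stream rows from source to target in fixed-size batches instead
    of fetchall(), keeping memory flat and using executemany() on the
    target side. src_table/dst_table are hypothetical names."""
    cur = src_conn.cursor()
    cur.execute("SELECT id, val FROM src_table")
    copied = 0
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        dst_conn.executemany(
            "INSERT INTO dst_table (id, val) VALUES (?, ?)", rows)
        copied += len(rows)
    dst_conn.commit()
    return copied
```

Tuning the batch size (and whatever array fetch size the ODBC driver itself exposes) is usually where the time goes.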
Arun -
Specifying the filename for outbound FTP transport in OSB
I want an OSB based service that will send via FTP a file with a specific name to a remote ftp location.
It appears OSB can't do this, because the FTP transport on a business service only allows you to specify the prefix and suffix and then generates a big long file name for the middle part.
Does someone know how to override this and specify the name that you want the remote file to be called?
mdsrobbins wrote:
Your response was helpful, but being new to OSB I'm still not quite there, because I want to pass a "variable" to the file name. Now, given I want to FTP a CSV file, I've specified a proxy messaging service in an MFL format behind a JMS queue. The business service underlying this is the FTP service, again using the MFL format. This all works, but in the message flow for the proxy I can't seem to get access to a user-defined property which contains the filename, which I could then pull out and stuff into the "filename" header as suggested above.
Does anyone have any ideas how I can get access to a filename property from a JMS message?
From what I understand:
JMSQ--- JMS PRoxy (MFL)----Pipeline ----->BS (MFL/FTP)---FTP server
So you would like to set the filename that is contained in the JMS message's custom headers? While creating your JMS proxy you have to follow these steps:
1) While creating the JMS proxy, in the Transport Configuration page select Get All Headers = Yes.
2) If the file name is set as a user-defined property in the JMS message, then use $header in your pipeline/message flow; this variable will hold your user-defined properties. If the user-defined property is part of the JMS message body, then use $body.
thanks
Manoj -
Flex Data Services JOTM, JTA and JMX downloads
I am not a Java developer. I am an Information Architect/UI
Designer and up until a few months ago, I designed GUIs using MS
Studio.Net -- Tomcat is a little different from IIS, to say the least.
=)
I have downloaded the trial software of Flex Builder 2 and
Flex Data Services to test it with intent to purchase.
I was getting real happy with Flex Builder 2. Breezing
through the "Getting Started" and all the tutorials. Absolutely
Love it.
Then along comes Flex Data Services. This application has
brought me to a screeching halt.
I've downloaded the files according to the instructions at:
http://www.adobe.com/support/documentation/en/flex/2/install.html#flexj2ee.
I placed the files in my webapps directory running on tomcat.
However, according to the instructions at:
http://www.adobe.com/support/documentation/en/flex/2/install.html#tomcat,
I need to download additional applications in order to
correctly use FDS. I have tried to download the JOTM 2.0.10.tgz
from
http://forge.objectweb.org/project/download.php?group_id=19&file_id=3926
without success. When I try to unzip the file (I am running
Windows XP Professional), I get the following error:
"Error reading header after processing 0 entries."
I've tried to download three different versions of JOTM, no
luck. Our Java developer suggested that I save the "tar" file as a
"zip" file and try to open it after download. However, when I try
to open the archive in WinZip it returns this error:
"Cannot open file: it does not appear to be a valid archive.
If you downloaded this file, try downloading the file again."
Both of these errors occur when trying to open the downloaded
JOTM 2.0.10.tgz and JOTM 2.0.8.tgz files.
How can I get these applications? Has anyone had any success
in downloading these files from:
http://forge.objectweb.org/project/showfiles.php?group_id=19&release_id=1024
The downloaded archive files are empty. I can not run any of
the Data Service samples.
When I tried at:
http://localhost:8080/samples/dataservice/flexcab/flexcabDispatcher.mxml
A service error prompt returns: "Unable to access
UserTransaction in Dataservice".
I'm assuming this is because JOTM (et al.) is not on my
box, because the archive is empty when I try to open it in WinZip.
Also, while I was reading the installation instructions for
Tomcat at:
http://www.adobe.com/support/documentation/en/flex/2/install.html#tomcat
it seems pretty complicated for a non-Java programmer to
understand. I asked one of our 6-year veteran Java developers here
to help me with this and he said it was a pretty a complicated
process for someone w/o Java programming experience. I thought it
was geared towards front-end developers. But he did say that if I
can't download the JOTM, I won't be able to do whatever it is that
Flex Data Services is supposed to do. I say the latter because I am
still in the "Getting Started" manual going through the last of the
tutorials (which is the Flex Data Services distributed application
tutorial) before I start reading "Using Flex", so I'm not really
sure what lessons I will be doing that require FDS to work
correctly on my box.
Do I really need the FDS to build interactive applications in
Flex Builder?
I appreciate your answers. Thanks in advance.
I agree with you about the process of getting FDS installed and working. I recommend that you try Christophe Coenraets' FDMS tutorial. The JOTM files are in "tar" format, and so I think you'll have to extract them using a "tar" utility. This is included with any Unix variant, and so I asked someone at work to extract the files; on Windows I think you'll have to find a free "tar" program.
I downloaded jotm-2.0.10.tar from the JOTM SourceForge. Use parameters "-xf" if the tar file is not compressed, or "-zxf" if it is compressed. Good luck! -
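For anyone stuck without a command-line tar on Windows, Python's standard library can perform the same extraction the reply above describes; mode "r:*" auto-detects whether the archive is gzip-compressed (the -zxf case) or plain (the -xf case). The function name here is just for illustration, and extractall should only be used on archives you trust:

```python
import tarfile

def extract_archive(path, dest="."):
    """Extract a .tar or .tar.gz/.tgz archive to `dest`.
    Mode "r:*" lets tarfile detect the compression automatically."""
    with tarfile.open(path, "r:*") as tf:
        tf.extractall(dest)       # only on archives from a trusted source
        return tf.getnames()      # list of extracted member names
```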
Master data services - Excel addin integration with Sharepoint
Hi Gurus,
I am looking for different ways to present the Excel add-in plugin to the user community. Can SharePoint be used to launch Excel within the portal? I am not sure if this is possible, where a user can launch Excel directly from the SharePoint website
so that Excel opens within the website.
It would be great to integrate with SharePoint, as it would allow users to go to one centralized location to perform MDS-related tasks.
The Excel UI can only be hosted in the Excel desktop client.
The Web UI can be hosted in SharePoint:
http://social.technet.microsoft.com/wiki/contents/articles/5734.sharepoint-2010-display-the-master-data-services-web-application.aspx -
Flex Data Services tutorial error
In the "Before You Begin" section of the Flex Data Services
tutorial in "Getting Started with Flex", the first bullet is
incorrect. Its text should be:
"Ensure that you have installed the Flex Data Services Beta 3
release and that you can run the applications in the samples web
application."
The installation instructions are located here:
http://www.adobe.com/go/flex2_installation
The tutorial zip file is located here:
http://www.macromedia.com/go/flex2beta1_quickstart_tutorial_zip
- Mike Peterson
Adobe Flex documentation team
Hi, I found some more "issues" with this data services tutorial.
First off, working with the notes example, at first I couldn't get the two browsers to talk to each other, until I found this comment by funk_sf on the livedocs:
quote:
after some searching, I located a link to the tutorial.zip
here:
http://www.macromedia.com/go/flex2beta1_quickstart_tutorial_zip.
i unzip'd the contents into my samples directory so that I had the
following path: C:\fds2\jrun4\servers\default\samples\tutorials
the xml file in WEB-INF for the standalone flexbuilder 2b3 is
located at C:\fds2\resources\config as mentioned in the comments on
the previous page (
http://livedocs.macromedia.com/labs/1/flex20beta3/00000129.html)
as for creating the tutorial1.mxml file, i created a new Flex
project with the following settings (this is from memory, so
hopefully i recall all the steps right):
File->New Flex Project
How will your Flex application access data? -> Flex Data
Services -> Compile application locally in Flex Builder
root folder: C:\fds2\jrun4\servers\default\samples
root url:
http://localhost:8700/samples/
I left the build paths to their defaults (ie. blank)
named the main application file: tutorial1.mxml
output folder: tutorial
output folder url: [blank]
So I followed his advice, deleted my project and made it in
the samples directory and the notes application started to work, so
all appeared good.
Until I was making the Java example and I'm now faced with
the following errors:
Severity Description Resource In Folder Location Creation
Time Id
2 Definition samples.contact:Contact could not be found.
tutorial2.mxml Tutorials line 8 6 juni 2006 16:36:25 20
Severity Description Resource In Folder Location Creation
Time Id
2 Type was not found or was not a compile-time constant:
Contact. tutorial2.mxml Tutorials line 13 6 juni 2006 16:36:25 21
I'm not sure what is happening, I think it can't find the
data service files but I don't know how to make it so that it does
find those files, the manual is still a bit vague on that and when
I try to run this application it just shows a blue (flex) screen.
The code I'm using is straight from the example documentation after
tinkering around for 2 days to get this to work I thought using
original code is the best way to go.
My application tries to run from
http://localhost:8700/samples/bin/tutorial2.html
The documentation states:
Open the following URL in two browser windows:
[L=http://localhost:port/samples/tutorials/tutorial2.mxml
The screenshots in the documentation show:
Window1: /tutorial/tutorial1.mxml
Window2: /dataservice/contact/tutorial_step4.mxml
Now I'm just confused: tutorial1.mxml is my notes application, which was the previous tutorial, and there was no mention of a tutorial_step4.mxml file?
Someone help me please with where to place these files and
why so I can understand what is happening here.
Thanks :) -
Auditing SAP Application - users lists
Hello everybody,
I am trying to find a way to report a list of users that have the following authorizations:
• access to administrative tasks
• access to batch input administration
• access to monitoring tools and logs
• access to promote to production
• access to table access
• access to user management
It is obvious, according to SOX compliance (security purposes), that this access should be restricted to a group of users, so I have to retrieve this data from the SAP application to answer my question.
Could anyone tell me what I should do? Which authorization objects and values should I use to create each report?
Thanks in advance,
S
Hi,
Check the report RSUSR002 or transaction SUIM for selecting users based on various parameters.
regards
Naveen kumar -
Security requirements to upgrade Master Data Services Database
What are the security requirements to upgrade an MDS database? When I choose Upgrade Database, after running the upgrade scripts I get the following exception:
Microsoft.MasterDataServices.Configuration.ConfigurationException: The user does not have access to the application.
Hi RicardoMarques182,
Did the error happen right after the upgrade was done, when trying to open the Master Data Services (MDS) application? Could you please help by posting the full (more detailed) error message?
Just per the general error message, please ensure the current login user has at least the Explorer function permission:
http://msdn.microsoft.com/en-us/library/ff487017.aspx
Thanks,
Jinchun Chen -
Complex data type to application service create method(CRUD)(Urgent)
Hi Experts,
I have created an entity service with remote persistency (web service). I am mapping the entity service CRUD methods to the web service methods, and I am calling these CRUD methods from the corresponding application service CRUD methods. Without complex data types, both services work fine. I have tested them in the service browser.
Later I created a complex data type in the entity service and mapped it to the web service. But when I create the application service create (CRUD) method for this entity service, the complex data type which I created in the entity service is not shown among the possible input attributes of the application service create method. Due to this I am not able to pass this complex attribute from the application service create method to the entity service create method.
Please tell me how I can resolve this issue. (Urgent)
Thanks
Sampath.G
Hi Sampath,
Please check SAP note 1004108. I think the issue you describe is one of the limitations described in this note.
Regards,
Jan -
Data Services XI3.1 function module files for SAP R/3 is not working
Hi guys,
Thank you for the quick response so far. I am very grateful to you all.
I've got an issue and will try to explain it in as much detail as I can; hope you guys won't mind. :P
I passed the function module to the SAP tech guy to install into SAP using the CTS method, using the 900086.R63 file type because my Chinese client is running SAP in a Unicode environment.
My DS is installed on my laptop client; the job server plus DB2 is on HP-UX. SAP is on another HP-UX server. The function module I used is the one supplied with Data Services XI 3.1.
Anyway, the installation was successfully done with the GUI wizard windows of the SAP Workbench instead of the tp command-line method.
1) However, when I checked the ZAW0 function group, I opened it and can't see any functions listed under it, although the table structures ZACTA, ZTAB2048 and other items were created.
2) So I tried to create the SAP datastore, transport target, etc., and tried to extract a file. It failed, and the log said "can't open file --- /db2/temp/curcode.txt". I checked, and the SAP working directory /db2/temp didn't have curcode.txt, because the result file was supposed to be extracted and stored there.
The SAP tech guy checked SAP and can't find the generated ABAP program ZCURCD running anyway, although my local CURCD ABAP program is created.
3) Is the 900086.R63 incomplete? Is my function installation correct? I can view data in the datastore, but just can't run a job to extract. Another non-Unicode group, 900200.S08, has larger file sizes. Should I also try to install the 900200.S08 files? Will it corrupt SAP and cause system errors?
4) After that, I tried to install the functions manually into ZAW0 one by one using cut and paste; however, after that I couldn't use the data view anymore without an error. Besides, I tried to run Check on the functions and they all returned syntax errors, so I can't Activate any of them in SAP. Maybe a dependency is missing?
Then I deleted the manually installed functions and can view data again, but still can't extract data with a job.
5) Another side issue is that all this SAP datastore creation and viewing is done through an old DI 11.5 installation. The new Data Services XI 3.1 is unable to create the datastore (it gives a database error), although the function module installed came from Data Services. Does anyone know the issue?
Thank you very much for reading it patiently. :P
Hi,
Thank you for the informative link. It did help a lot in solving the problem.
(1) to (4): The SAP tech guy didn't install the function programs correctly, and he redid the transport using CTS. Now I can see all the functions inside the function group ZAW0. As for the problem of not being able to generate the extracted file in the SAP working directory for downloading, it was due to a Unix directory access permission setting. We also had to include the FTP username in the SAPSYS user group, as the help link suggested.
(5) For this problem, I reinstalled DS with all the required components, like Server Manager. The network technical guy had helped me with the first installation, and that installation was not complete. So now the issue is solved; I can create an R/3 datastore with DS. -
Data Services communication with SAP - physical/logical server name
I'm having trouble connecting Data Services to SAP.
Environment details:
Data Services version 12.2.1.2 on Windows 2008 SP2
ERP and SRM on Windows 2008 (database SQL Server 2008 SP1)
Repository on SQL Server 2008 SP1
All servers (SAP and Data Services) have logical system names defined over the physical system names, and they use these logical names to communicate with each other. The Data Services server hence has two names (logical and physical) and two corresponding IP addresses.
The problem we're having is that when the SAP server communicates back to the job server, it needs to use the logical server name and corresponding IP address. Currently it is using the physical server name and IP address, causing communication failures.
When I run the host_name() function within Data Services it returns the physical host name.
Is there a way to determine how Data Services identifies itself and whether this can be manipulated? Is it possible to configure Data Services to run in a logical environment?
Any thoughts or comments would be appreciated. Thanks
Looked at this from another angle. We were using the Direct Download transport method, which processes in the foreground and uses the SAPGUI for communication. It seems it was how SAPGUI communicates with the SAP server on Data Services' behalf that was causing the problem.
We managed to motivate for a share to be set up between the SAP server and the Data Services box, which allows us to use the Shared Directory transport method. This also allows for background processing, eliminating the part of the SAPGUI communication that was causing the problem.