BPC retraction, transformation file inquiry
Hello Experts,
We have a BAdI on BPC 10 SP17 that sends information from BPC to ECC (retraction). We have everything set up, and when we run the Data Manager package it finishes successfully, but when checking the ECC report it doesn't show any information.
I suspect it is because the cost centers in BPC are built as <prefix><ECC cost center ID>.
Would it be correct to use a transformation file to remove the prefix, or must it be done in BW?
If it is a BPC transformation file, how can I do it?
Thank you in advance.
Best Regards.
Hi Sebastian,
If you are using a BAdI to do the retraction, then you don't need a transformation file to remove the prefix; you can do that in your BAdI.
Ask your ABAPer to debug it.
Andy
Similar Messages
-
Hi,
I'm having issues generating a BCP XML format file using a fairly unusual column terminator, the double dagger symbol
‡ (Alt+0135), which I need to support.
I'm experiencing this problem with bcp.exe for SQL2008 R2 and SQL2012.
If I run the following command line:
bcp MyDB.TMP.Test_Extract format nul -c -x -f "C:\BCP\format_file_dagger_test.xml" -T -S localhost\SQL2012 -t‡
I end up with a XML format file like so:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<RECORD>
<FIELD ID="1" xsi:type="CharTerm" TERMINATOR="ç" MAX_LENGTH="255" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="2" xsi:type="CharTerm" TERMINATOR="ç" MAX_LENGTH="50"
.. and so on.
You will notice that TERMINATOR="ç" (lowercase c-cedilla) is output instead of TERMINATOR="‡". The
ç character is, strangely enough, Alt+135 (and not Alt+0135), so this may be more than a coincidence! I know you can specify the code page, but that switch applies to the data being imported or exported, not to the format file
itself (I tried it anyway).
In order to use the XML file for bulk import, I manually substituted the 'ç' character with '‡', after which BCP imports '‡'-delimited data fine.
This character swap doesn't occur if I generate a non-XML format file (the '‡' character is written to the format file correctly); however, that file produces other import errors, which I don't encounter with a standard delimiter like a comma. So I have stuck
with the working XML format file, which I prefer.
Does anyone know why this is happening? I plan to automate the generation of the XML format file and would like to avoid the additional text-substitution step if possible.
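Until the root cause is fixed, the substitution step can at least be automated. A minimal Python sketch (the file path and encoding below are assumptions; adjust to your setup):

```python
def fix_terminators(xml: str) -> str:
    # Replace only the TERMINATOR attribute values, leaving other text alone.
    return xml.replace('TERMINATOR="ç"', 'TERMINATOR="‡"')

if __name__ == "__main__":
    # Path is a placeholder; point it at wherever bcp wrote the format file.
    path = "C:/BCP/format_file_dagger_test.xml"
    with open(path, encoding="utf-8") as f:
        xml = f.read()
    with open(path, "w", encoding="utf-8") as f:
        f.write(fix_terminators(xml))
```

Running this after each `bcp ... format nul -x -f ...` call would remove the manual editing step.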
Thank you.
Hi Ham09,
According to your description, we did a test and found that the terminator character changes due to the code page of the operating system. If you choose a different time zone in the Date and Time settings and run the same bcp test, you will find it
exports a different TERMINATOR in your XML format file. For example, you can enter the character "ç" (Alt+135) in the (UTC-12:00) International Date Line West time zone and the (UTC+09:00) Osaka, Sapporo, Tokyo time zone, and check whether the terminators are different.
By default, the field terminator is the tab character (represented as \t). To represent a paragraph mark, use \r\n.
For more information about code pages on Windows, you can review the following article:
http://msdn.microsoft.com/en-us/library/windows/desktop/dd317752(v=vs.85).aspx
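For what it's worth, Alt+135 and Alt+0135 both enter the same byte, 0x87: it decodes as 'ç' under the OEM code page 437 but as '‡' under the ANSI code page 1252, which would explain the swap. A quick Python check of the collision:

```python
# Alt+135 and Alt+0135 both produce the single byte 0x87; what you see
# depends on which Windows code page interprets it.
oem = b"\x87".decode("cp437")    # OEM console code page -> 'ç'
ansi = b"\x87".decode("cp1252")  # ANSI Windows code page -> '‡'
print(oem, ansi)
```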
Regards,
Sofiya Li
TechNet Community Support -
Error validating transformation file
Hello all, I am trying to load a Data Package in BPC.
I created a flat file and a transformation file.
When I validate the transformation file, I get the following error:
Validate has successfully completed
ValidateRecords = YES
Task name CONVERT:
No 1 Round:
Command CATEGORY does not start with *
Error: Validate with data file failed
I don't have any logic activated. I don't understand the error: where do I set the command CATEGORY? It is a dimension name, not a command.
Any idea what I am doing wrong?
Thanks
Sergio
Thanks Pravin.
This is strange: I have tried all the combinations of the Header and Skip options but always get the error.
I changed the transformation file as follows:
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ;
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
*MAPPING
Category =*COL(1)
Entity =*COL(2)
P_ACCT =*COL(3)
P_Activity =*COL(4)
P_DataSrc =*COL(5)
RptCurrency =*COL(6)
Time =*COL(7)
Amount =*COL(8)
*CONVERSION
now the error is different:
Validate has successfully completed
ValidateRecords = YES
Task name CONVERT:
No 1 Round:
An exception with the type CX_SY_STRUCT_COMP_NAME occurred, but was neither handled locally, nor declared in a RAISING clause
Component Name '2008.OCT' of the Component 15 Contains an Invalid Character
Error: Validate with data file failed
Any idea?
Thanks
Sergio -
Possible reasons for BCP import failure? [Unable to open BCP host data-file]
Hi,
I have inherited a process where a bunch of tables are prepared locally with Python and pushed to a database. It fails and I don't know why, though my first thought is access rights. I have admin access to the database (and server?) using SQL Server
authentication. Here is an example BCP command churned out by the Python script:
bcp "[TEST].[dbo].[ISP_2014_flat]" in "flat_files/ISP_2014_201504011504.csv" -U admin -P XXX! -c -t ^ -S "xxx20039999\DZUHE00999,2311"
SQLState = S1000, NativeError = 0 Error = [Microsoft][SQL Server Native Client 10.0]Unable to open BCP host data-file
What can I check that might be responsible for this problem?
Thanks,
Andrew
Below are points that you can consider:
I don't think you need double quotes (") around the table and file names.
If the file is on a shared folder, provide the UNC path, e.g.
\\server\folder\file.csv; even if it is local, try providing the full path.
Make sure the file is not open in another program (if it already exists).
If it works from SSMS but doesn't work when run via SQL Agent, the Agent service account must have access to the file.
You can see a few examples in the bcp Utility documentation, and you may also try using BULK INSERT.
bcp Utility
BULK INSERT (Transact-SQL)
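To narrow down which of these conditions is biting, a small pre-flight check before invoking bcp can help. A Python sketch (the path is a placeholder from the question; substitute your own):

```python
import os

def preflight(path):
    """Describe the first problem found with a bcp host data file, or 'ok'."""
    full = os.path.abspath(path)  # relative paths resolve against the cwd bcp runs in
    if not os.path.exists(full):
        return "missing: " + full
    if not os.access(full, os.R_OK):
        return "not readable by this account: " + full
    try:
        with open(full, "rb"):  # fails if another process holds an exclusive lock
            pass
    except OSError as e:
        return "cannot open (%s): %s" % (e, full)
    return "ok: " + full

if __name__ == "__main__":
    print(preflight("flat_files/ISP_2014_201504011504.csv"))
```

Note this checks the account running the script; if SQL Agent runs the bcp step, the same check has to pass for the Agent service account.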
Cheers,
Vaibhav Chaudhari
[MCTS],
[MCP] -
I wonder how to bcp a txt file into Sybase: what method and syntax do I need? I know it is possible using DOS commands, but I'm not sure what to do in a Java program. Can anyone please help? Many thanks.
Dave, thanks for the suggestion. I am having a lot of trouble, and if that code works for you, then it must be my connection string.
I am trying
import java.io.*;
public class rabbit5 {
    public static void main(String[] args) {
        String bcp_command = "bcp [maindatabase].[databaseowner].[table-to-populate] in [file - in] -I -Sserver -Uuser -Ppassword >stdout_ylogfile 2>std_error_file";
        try {
            String[] cmd = {"cmd.exe", "/C", bcp_command};
            Process p = Runtime.getRuntime().exec(cmd);
            p.waitFor();
        } catch (Exception i) {
            System.out.println("this has not bcp'd");
        }
    }
}
But I get a variety of errors, from not being able to find my database to not being able to find my server (in that order, as well!), all to do with my bcp_command string.
You have been very helpful, but I think I need an example of a working string. Please?
many thanks
Kevin -
Preview transformation file in data manager package
Dear BPC Experts,
When we try to preview the transformation file while running a Data Manager package to import transaction data from BW, we get the following error. We do not get this error if we use the load-from-flat-file package.
We are on BPC 10 SP06, EPM Add-in SP14 patch 3.
Has anybody seen this issue before? We can paste the entire log if required.
See the end of this message for details on invoking
just-in-time (JIT) debugging instead of this dialog box.
************** Exception Text **************
System.ArgumentException: Separator cannot be null and must contain only one char
Parameter name: separator
at FPMXLClient.DataManager.CsvParser.Parse(String data, String separator, Boolean hasHeader) in d:\Olympus_100_REL_XLCLIENT\src\FPMXLClient\src\DataManager\CsvParser.cs:line 15
at FPMXLClient.DataManager.UI.Forms.FilePreview.BuildDataArrayFromCsv(String data) in d:\Olympus_100_REL_XLCLIENT\src\FPMXLClient\src\UILayer\DataManagerUI\Forms\FilePreview.cs:line 487
at FPMXLClient.DataManager.UI.Forms.FilePreview.BuildDataArray(String data, Boolean formatted) in d:\Olympus_100_REL_XLCLIENT\src\FPMXLClient\src\UILayer\DataManagerUI\Forms\FilePreview.cs:line 414
at FPMXLClient.DataManager.UI.Forms.FilePreview.SpecialFilesProcessing() in d:\Olympus_100_REL_XLCLIENT\src\FPMXLClient\src\UILayer\DataManagerUI\Forms\FilePreview.cs:line 406
at FPMXLClient.DataManager.UI.Forms.FilePreview.DisplayData() in d:\Olympus_100_REL_XLCLIENT\src\FPMXLClient\src\UILayer\DataManagerUI\Forms\FilePreview.cs:line 351
at FPMXLClient.DataManager.UI.Forms.FilePreview.InitializePreview() in d:\Olympus_100_REL_XLCLIENT\src\FPMXLClient\src\UILayer\DataManagerUI\Forms\FilePreview.cs:line 102
at FPMXLClient.DataManager.UI.Forms.FilePreview.FilePreview_Load(Object sender, EventArgs e) in d:\Olympus_100_REL_XLCLIENT\src\FPMXLClient\src\UILayer\DataManagerUI\Forms\FilePreview.cs:line 740
at System.Windows.Forms.Form.OnLoad(EventArgs e)
at FPMXLClient.UILayer.Forms.BaseForm.OnLoad(EventArgs e) in d:\Olympus_100_REL_XLCLIENT\src\FPMXLClient\src\UILayer\UI\Forms\Base\BaseForm.cs:line 70
at System.Windows.Forms.Form.OnCreateControl()
at System.Windows.Forms.Control.CreateControl(Boolean fIgnoreVisible)
at System.Windows.Forms.Control.CreateControl()
at System.Windows.Forms.Control.WmShowWindow(Message& m)
at System.Windows.Forms.Control.WndProc(Message& m)
at System.Windows.Forms.ScrollableControl.WndProc(Message& m)
at System.Windows.Forms.Form.WmShowWindow(Message& m)
at System.Windows.Forms.Form.WndProc(Message& m)
at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m)
at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)
Best Regards,
Ashwin.
Hi Raju,
Thank you for your reply.
It seems that it is an SP-related issue. When we downgraded our EPM Add-in to SP13 patch 4, it did not throw any error.
Best Regards,
Ashwin. -
Display a PDF file from local drive
Hi,
I would like to display a PDF file that is actually stored on the portal server hard drive.
What I have tried is linking an iFrame element to the URL of the file: "D:\MyFolder\myFile.pdf"
But it doesn't work. I get the following error when trying to do that:
com.sap.tc.webdynpro.services.exceptions.InvalidUrlRuntimeException: Invalid URL=D:/MyFolder/myFile.pdf
Does this mean that an iFrame can only display HTTP URLs? What are the other ways to easily display PDF files? I'm stuck with this problem and can't find any other solution.
I also tried "servletResponse.getOutputStream().write(fileByteContent);", but the problem is that I am not in a PAR application, so I don't have the servletResponse object and don't know how to get it.
Thanks for your help.
Thibault Schalck
Hi,
You cannot access the PDF file using a normal file-system path (C:\...). Use the following code to open the PDF in your iFrame; this worked in my application:
String sFileName = strCampCodeVal + "_" + nReqIdVal + ".pdf";
String sFile = "C:\\SBLI\\BCP\\barcode_files\\" + sFileName;
File fFile = new File(sFile);
if (fFile == null) {
    System.out.println("System can not download the file at this time. Please try again later.");
    return;
}
// Checking the file for existence
if (!fFile.exists() || !fFile.canRead()) {
    System.out.println("You have specified an invalid file to download. Please check and try again.");
    return;
}
// Set the content type (it's important); try with application/pdf also
res.setContentType("application/force-download");
// Setting the headers
res.setHeader("Content-disposition", "attachment;filename=" + sFileName);
res.setHeader("Cache-control", "must-revalidate");
ServletOutputStream sosCur = res.getOutputStream();
// Reading the file and writing the stream
BufferedInputStream bisFile = null;
try {
    byte[] bBuffer = new byte[4096];
    bisFile = new BufferedInputStream(new FileInputStream(fFile));
    int nBytes = -1;
    while ((nBytes = bisFile.read(bBuffer, 0, 4096)) != -1) {
        sosCur.write(bBuffer, 0, nBytes);
    }
} catch (Exception ex) {
}
This will resolve your issue. All the best.
Regards..
krishna.. -
A question for transformation/conversion file
Hi Guru:
I load data from a file for a dimension.
For one member, I want to set the value to blank if the source value equals "1000".
How do I write the conversion/transformation file to achieve this?
Thanks.
Eric
Hi,
What Nilanjan said is correct: you cannot have a blank member in BPC.
I can suggest a workaround: create a member BLANK in your dimension with no description, and maintain the conversion file to map the source value 1000 to this member.
When you report on this, the end user will see only the description (if you hide the key values), which will be empty. So it feels as if no ID is assigned to it.
Hope this helps,
Regards,
G.Vijaya Kumar -
Hi,
Can any of you explain the purpose of the data transformation file?
How do I use script logic based on a BAdI in SAP BPC? Do we need to call it in Default.lgf?
If I want to apply some custom logic while populating data using a BAdI, do I need to call that BAdI in the data transformation file?
Thanks,
Ben.
Hi Ben,
There is no possibility of calling user-defined scripts or a BAdI from transformation files. The main use of a transformation file is to map data from source to destination.
If you need to load data based on some filter criteria, there are certain functionalities available for transformation file creation, which you can find at the link below:
http://help.sap.com/saphelp_bpc70sp02/helpdata/en/66/ac5f7e0e174c848b0ecffe5a1d7730/frameset.htm
Hope this helps,
Regards,
G.Vijaya Kumar -
I'm trying to export from bcp to a Windows Named Pipe. Is this supported?
I'm getting the error below when running this export...
"bcp" [SQL_Class].[dbo].[Customer_table] out "\\.\pipe\testpipe" -S 12.12.122.12,12121 -U sa -e "err.txt" -o "out.txt" -w
I'm able to log on to SQL Server, but I'm getting this error...
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 11 for SQL Server]Unable to open BCP host data-file
Dim pipeServer As New NamedPipeServerStream("testpipe",
    PipeDirection.InOut, 4,
    PipeTransmissionMode.Byte,
    PipeOptions.Asynchronous, 131072, 131072)
pipeServer.WaitForConnection()
I've been able to start a reader (Teradata FastLoad), which opens the pipe and waits for the read from bcp, but I keep getting this error. Is bcp looking for the pipe on the server instead of locally? Is this a permission error? I haven't found much about using bcp with a Windows named pipe, so I'm wondering whether this is possible at all. Thanks.
Hi coffingdw,
I agree with you; there should not be an issue with BCP itself. You can use the following command to export data to a txt file.
BCP MyAdventureWorks.Person.Person out "c:\out.txt" -S servername -T -c
As I understand it, you are using Teradata FastLoad. You can refer to the following thread about sending data to a Windows named pipe on the Teradata forum:
http://forums.teradata.com/forum/tools/sending-data-to-windows-named-pipe
Best Regards,
Tracy
Tracy Cai
TechNet Community Support -
Positive/negative account conversion file
Hi experts,
I am trying to create a conversion file in which, depending on the sign of the value of an account, it converts to one account or another.
I have tried writing this formula in the INTERNAL column:
IF(VALUE>0;A100;L100)
I also added the following line to the *OPTIONS section of the transformation file:
CONVERT_INTERNAL=NO
However, the VALUE>0 condition never evaluates to true, so although the value in the file is >0, the data is always converted to the false branch (L100).
Any idea?
Regards
Hi,
SQL> select abs(-10) from dual;
ABS(-10)
10
abs gives the absolute value.
Edited by: user291283 on Aug 31, 2009 11:52 PM -
How can I debug a Bulk Insert error?
I'm loading a bunch of files into SQL Server. All work fine, but one keeps erroring out on me. All the files should be exactly the same in structure; they have different dates and other financial metrics, but the structure and field
names should be identical. Nevertheless, one keeps conking out and throwing this error:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
The ROWTERMINATOR should be CRLF, and when you look at the file in Notepad++ that's what it looks like, but it must be something else, because I keep getting errors here. I tried the good old ROWTERMINATOR='0x0a'.
That works on all files but one, so something funky is going on here, and I need to see what SQL Server is really doing.
Is there some way to print out a log, or look at a log somewhere?
Thanks!!
Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.
The first thing to try is to see whether BCP likes the file. BCP and BULK INSERT adhere to the same spec, but they are different implementations, and there are subtle differences.
There is an ERRORFILE option, but it more helps when there is bad data.
You can also use the BATCHSIZE option to see how many records in the file it swallows, before things go bad. FIRSTROW and LASTROW can also help.
All in all, it can be quite tedious to find the single row where things are different and where BULK INSERT loses sync entirely. Keep in mind that it reads fields one by one, and if there is one field terminator too few on a line, it will consume the line
feed at the end of the line as data.
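One way to shorten the search is to count field terminators per line yourself and flag the rows that deviate from the majority. A Python sketch (the terminator and file name below are assumptions; substitute your own):

```python
from collections import Counter

def find_odd_rows(lines, terminator=","):
    """Yield (line_number, field_count) for rows whose field count differs
    from the most common field count in the file."""
    counts = [line.rstrip("\r\n").count(terminator) + 1 for line in lines]
    expected = Counter(counts).most_common(1)[0][0]
    for lineno, n in enumerate(counts, start=1):
        if n != expected:
            yield lineno, n

if __name__ == "__main__":
    # File name and terminator are placeholders; use your own.
    with open("data_file.csv", encoding="utf-8") as f:
        for lineno, n in find_odd_rows(f, terminator=","):
            print("line %d has %d fields" % (lineno, n))
```

This won't catch terminators embedded inside quoted data, but it usually pinpoints the line where BULK INSERT starts losing sync.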
Erland Sommarskog, SQL Server MVP, [email protected] -
Sp_addscriptexec is not working with FTP Snapshot delivery
Hi,
We have a SQL Server Merge Replication topology. Both Publisher a Subscribers are using SQL Server 2012.
As I have seen in http://msdn.microsoft.com/en-us/library/ms174360(v=sql.120).aspx
"Using sp_addscriptexec to
post a script file for publications using FTP for snapshot delivery is only supported for Microsoft SQL Server Subscribers."
So we should be OK...
The problem is that the Merge Agent is looking for the script on the local PC instead of downloading it from the FTP server.
Here is the Code at the publisher:
exec sp_addscriptexec 'IPoint_Pub','C:\IPoint_Files\ReplicationScripts\User\reIndex.sql'
This is how the subscription was created:
use IPoint
exec sp_addmergepullsubscription @publisher = @Central_Server_Name, @publication = N'IPoint_Pub', @publisher_db = @MGR_DB, @subscriber_type = N'Local', @subscription_priority = 0, @description = N'', @sync_type = N'Automatic'
exec sp_addmergepullsubscription_agent
@publisher = @Central_Server_Name,
@publisher_db = @MGR_DB,
@publication = N'IPoint_Pub',
@distributor = @Central_Server_Name,
@distributor_security_mode = 0, @distributor_login = 'XXX', @distributor_password = 'XXX',
@enabled_for_syncmgr = N'True',
@frequency_type = 4, @frequency_interval = 1, @frequency_relative_interval = 1, @frequency_recurrence_factor = 1, @frequency_subday = 4, @frequency_subday_interval = 3, @active_start_time_of_day = 0, @active_end_time_of_day = 235959, @active_start_date = 20100527, @active_end_date = 99991231,
@alt_snapshot_folder = N'', @working_directory = N'',
@use_ftp = N'True',
@job_login = null, @job_password = null,
@publisher_security_mode = 0, @publisher_login = 'XXX', @publisher_password = 'XXX',
@use_interactive_resolver = N'False', @dynamic_snapshot_location = null, @use_web_sync = 0,
@hostname =@SUCURSAL_ID
GO
And the log of REPLMERG.EXE:
2014-03-29 13:01:49.009 Microsoft SQL Server Merge Agent 11.0.2100.60
2014-03-29 13:01:49.009 Copyright (c) 2008 Microsoft Corporation
2014-03-29 13:01:49.009
2014-03-29 13:01:49.009 The timestamps prepended to the output lines are expressed in terms of UTC time.
2014-03-29 13:01:49.009 User-specified agent parameter values:
-Publication Netul_Pub
-Publisher SRV01
-Subscriber NETUL-01\SQLEXPRESS
-Distributor SRV01
-PublisherDB IPoint_Netul
-SubscriberDB IPoint
-SubscriptionType 1
-ParallelUploadDownload 1
-DistributorLogin sucursal
-DistributorPassword **********
-DistributorSecurityMode 0
-PublisherLogin sucursal
-PublisherPassword **********
-PublisherSecurityMode 0
-SubscriberSecurityMode 1
-OutputVerboseLevel 1
-Validate 0
2014-03-29 13:01:49.019 Connecting to Subscriber 'NETUL-01\SQLEXPRESS'
2014-03-29 13:01:50.654 Connecting to Distributor 'SRV01'
2014-03-29 13:01:51.599 Initializing
2014-03-29 13:01:51.604 Validating publisher
2014-03-29 13:01:51.614 Connecting to Publisher 'SRV01'
2014-03-29 13:01:51.769 Retrieving publication information
2014-03-29 13:01:51.774 Retrieving subscription information.
2014-03-29 13:01:52.294 Connecting to Subscriber 'NETUL-01\SQLEXPRESS'
2014-03-29 13:01:52.304 Connecting to Distributor 'SRV01'
2014-03-29 13:01:52.414 Initializing
2014-03-29 13:01:52.429 Validating publisher
2014-03-29 13:01:52.439 Connecting to Publisher 'SRV01'
2014-03-29 13:01:52.604 Retrieving publication information
2014-03-29 13:01:52.619 Retrieving subscription information.
2014-03-29 13:01:53.274 [29%] [0 sec remaining] Snapshot files will be downloaded via ftp
2014-03-29 13:01:53.284 [29%] [0 sec remaining] Snapshot will be applied from a compressed cabinet file
2014-03-29 13:01:53.294 [29%] [0 sec remaining] Connecting to ftp site 'SRV01.real2b.com'
2014-03-29 13:01:55.019 [33%] [2 sec remaining] Extracting snapshot file 'Documento_Formc3f5d2f9_190.sch' from cabinet file
2014-03-29 13:01:55.064 [33%] [2 sec remaining] Extracted file 'Documento_Formc3f5d2f9_190.sch'
2014-03-29 13:01:57.299 [33%] [2 sec remaining] Applied script 'Documento_Formc3f5d2f9_190.sch'
2014-03-29 13:01:57.304 [33%] [2 sec remaining] Preparing table 'Documento_Formulario_Otros' for merge replication
2014-03-29 13:02:02.574 [51%] [8 sec remaining] Extracting snapshot file 'sysmergesubsetfilters_Documento_Formulario_Otros90.bcp' from cabinet file
2014-03-29 13:02:02.594 [51%] [8 sec remaining] Extracted file 'sysmergesubsetfilters_Documento_Formulario_Otros90.bcp'
2014-03-29 13:02:02.604 [55%] [7 sec remaining] Bulk copying data into table 'sysmergesubsetfilters'
2014-03-29 13:02:02.609 [55%] [7 sec remaining] Bulk copied data into table 'sysmergesubsetfilters' (0 rows)
2014-03-29 13:02:02.619 [55%] [7 sec remaining] Extracting snapshot file 'Documento_Formc3f5d2f9_190.dri' from cabinet file
2014-03-29 13:02:02.624 [55%] [7 sec remaining] Extracted file 'Documento_Formc3f5d2f9_190.dri'
2014-03-29 13:02:02.864 [55%] [7 sec remaining] Applied script 'Documento_Formc3f5d2f9_190.dri'
2014-03-29 13:02:02.874 [59%] [6 sec remaining] Extracting snapshot file 'Documento_Formc3f5d2f9_190.trg' from cabinet file
2014-03-29 13:02:02.884 [59%] [6 sec remaining] Extracted file 'Documento_Formc3f5d2f9_190.trg'
2014-03-29 13:02:02.889 [62%] [5 sec remaining] Applied script 'Documento_Formc3f5d2f9_190.trg'
2014-03-29 13:02:02.899 [62%] [5 sec remaining] Extracting snapshot file 'Documento_Formc3f5d2f9_190.prc' from cabinet file
2014-03-29 13:02:02.919 [62%] [5 sec remaining] Extracted file 'Documento_Formc3f5d2f9_190.prc'
2014-03-29 13:02:03.819 [62%] [5 sec remaining] Applied script 'Documento_Formc3f5d2f9_190.prc'
2014-03-29 13:02:09.334 [74%] [5 sec remaining] Launching sqlcmd to apply the script 'Create_Categorias_Tables.sql'
2014-03-29 13:02:09.974 [74%] [5 sec remaining] Applied script 'Create_Categorias_Tables.sql'
2014-03-29 13:02:09.979 [77%] [4 sec remaining] Launching sqlcmd to apply the script 'Create_Categorias_Tables.sql'
2014-03-29 13:02:10.144 [77%] [4 sec remaining] Applied script 'Create_Categorias_Tables.sql'
2014-03-29 13:02:10.174 [81%] [3 sec remaining] Launching sqlcmd to apply the script 'reIndex.sql'
2014-03-29 13:02:10.189 [81%] [3 sec remaining] Last 115 characters in 'sqlcmd' output buffer: Sqlcmd: 'C:\inetpub\ftproot\ReplData\ftp\SRV01_IPOINT_NETUL_NETUL_PUB\UserScripts\reIndex.sql': Invalid filename.
2014-03-29 13:02:10.194 [81%] [3 sec remaining] Failed to apply the script 'reIndex.sql' using the 'sqlcmd' utility.
2014-03-29 13:02:10.204 The schema script 'reIndex.sql' could not be propagated to the subscriber.
2014-03-29 13:02:10.254 Category:NULL
Source: Merge Replication Provider
Number: -2147201001
Message: The schema script 'reIndex.sql' could not be propagated to the subscriber.
2014-03-29 13:02:10.259 [100%] The process was successfully stopped.
2014-03-29 13:02:10.299 Category:NULL
Source: Merge Replication Provider
Number: -2147200963
Message: The process was successfully stopped.
As you can see, the FTP snapshot is working fine ("Extracted file 'Documento_Formc3f5d2f9_190.trg'"), but then it looks for a file that only exists on the server:
'C:\inetpub\ftproot\ReplData\ftp\SRV01_IPOINT_NETUL_NETUL_PUB\UserScripts\reIndex.sql'
Do you have any idea?
Best Regards, Daniel.
Hi again Brandon,
I understand what you are saying, and I know it would solve the problem, but it's not our case.
We have other installations with merge replication that use a shared (UNC) path to download the initial snapshot, and every user script is also downloaded fine from the shared location.
In this case, we have no direct access to a shared folder on the server. The full replication process is done over the internet, without a VPN.
I think the snapshot folder is configured correctly, as the snapshot is being downloaded OK from the FTP server.
My question is: why does the merge agent connect to the FTP server to download the snapshot, but does not do the same to download the user script?
Maybe it's a bug in the Merge Agent, or some parameter I am not passing?
This is the merge agent invocation:
REPLMERG.EXE -Publication %Publication% -Publisher %Publisher% -Subscriber %Subscriber% -Distributor %Publisher% -PublisherDB %PublicationDB% -SubscriberDB %SubscriptionDB% -SubscriptionType 1 -ParallelUploadDownload 1 -DistributorLogin %user% -DistributorPassword %password% -DistributorSecurityMode 0 -PublisherLogin %user% -PublisherPassword %password% -PublisherSecurityMode 0 -SubscriberLogin %user% -SubscriberPassword %password% -SubscriberSecurityMode 1 -OutputVerboseLevel 1 -Validate 0
Best Regards, Daniel. -
Hi All,
We have a client with one CAS and 4 primary sites (all on SCCM 2012 R2). Because of some maintenance work, one of the primary sites was down for a few hours, and now the link in the site hierarchy shows as failed. When we hover over the link, it
shows that global data replication between the CAS and the primary site is failing.
What we have tried:
1. Ran the Replication Link Analyzer: it is unable to resolve the issue and just shows the names of the replication groups that are failing.
2. Ran the spDiagDRS command on the primary site DB: this shows the global replication groups that are failing and not active.
3. Ran Update RCM_DrsInitializationTracking set InitializationStatus = 7 where ReplicationGroup = 'Group Name' to reinitialize the replication.
4. Dropped the configuration data .pub files into the primary server's rcm.box, as mentioned
here. This solved the issue initially, but replication failed again after some time.
5. rcmctrl.log on the CAS shows this error:
Calling BCP out with SELECT ID, SYSRESUSEID, SITENUMBER, NAME, VALUE~FROM SC_SysResUse_PropertyList, C:\Program Files\Microsoft Configuration Manager\inboxes\rcm.box\40075d72-d82a-4e1e-9944-2afcadf5813f\SC_SysResUse_PropertyList.bcp, C:\Program Files\Microsoft
Configuration Manager\inboxes\rcm.box\40075d72-d82a-4e1e-9944-2afcadf5813f\bcpErrors.errors.
Calling BCP out with SELECT ID, SITENUMBER, USERNAME, PASSWORD, AVAILABILITY~FROM SC_UserAccount, C:\Program Files\Microsoft Configuration Manager\inboxes\rcm.box\40075d72-d82a-4e1e-9944-2afcadf5813f\SC_UserAccount.bcp, C:\Program Files\Microsoft Configuration
Manager\inboxes\rcm.box\40075d72-d82a-4e1e-9944-2afcadf5813f\bcpErrors.errors.
Calling BCP out with SELECT ID, USERACCOUNTID, SITENUMBER, NAME, VALUE1, VALUE2, VALUE3~FROM SC_UserAccount_Property, C:\Program Files\Microsoft Configuration Manager\inboxes\rcm.box\40075d72-d82a-4e1e-9944-2afcadf5813f\SC_UserAccount_Property.bcp, C:\Program
Files\Microsoft Configuration Manager\inboxes\rcm.box\40075d72-d82a-4e1e-9944-2afcadf5813f\bcpErrors.errors.
Calling BCP out with SELECT ID, USERACCOUNTID, SITENUMBER, NAME, VALUE~FROM SC_UserAccount_PropertyList, C:\Program Files\Microsoft Configuration Manager\inboxes\rcm.box\40075d72-d82a-4e1e-9944-2afcadf5813f\SC_UserAccount_PropertyList.bcp, C:\Program Files\Microsoft
Configuration Manager\inboxes\rcm.box\40075d72-d82a-4e1e-9944-2afcadf5813f\bcpErrors.errors.
Need help on this. I can see the errors coming, but I need suggestions for troubleshooting this issue.
Thanks,
Pranay.
Hi,
Any update? Could you find any helpful information in the rcmctrl.log?
Best Regards,
Joyce
Help! Directory compiled path too long for Workshop
Hi,
I need some help in confirming the following behaviour. And possibly rectifying it.
I created a project within Workshop that has a very long folder structure. Some of the folder names exceed 20 characters, and there can be multiple levels of such folders. I created a process (.jpd) and added a control send-with-return node that performed a transformation on the variables being passed into the control. This created a .dtf and .xq file with names based on the process diagram and control (I think). Regardless, the created transformation files are long, really long. When I attempt to build the project I get:
Error: Could not write to 'C:\CVS\eai\src\workshopApps\eai\.....\SomeClass.class (The system cannot find the path specified)"
I have determined that the total path length (including 'C:\', all the backslashes in between, and the '.class' suffix) can be at most 259 characters. Anything longer will fail; i.e. 260 fails, while 259, 258, etc. compile perfectly.
Does this break some rule within the JVM, or within Workshop?
Is there some switch within Workshop that will allow extra-long directories?
It also seems that, minus the 'C:\' in the path, the maximum becomes 256. Could Workshop be using a char to store the path? Hmm... interesting.
Here is the exact error message i am getting with full path info. It is from a sample/test project i created to replicate the issue.
Error: Could not write to 'C:\CVS\eai\src\workshopApps\eai\.workshop\output\_Test_Project\WEB-INF\classes\aaaaaaaaaa\bbbbbbbbbb\cccccccccc\dddddddddd\eeeeeeeeee\ffffffffff\gggggggggg\hhhhhhhhhh\iiiiiiiiii\jjjjjjjjjj\kkkkkkkkkk\llllllllll\mmmmmmmmmm\nnnnnnnnnn\oooooooooo\abcdefghij.class': C:\CVS\eai\src\workshopApps\eai\.workshop\output\_Test_Project\WEB-INF\classes\aaaaaaaaaa\bbbbbbbbbb\cccccccccc\dddddddddd\eeeeeeeeee\ffffffffff\gggggggggg\hhhhhhhhhh\iiiiiiiiii\jjjjjjjjjj\kkkkkkkkkk\llllllllll\mmmmmmmmmm\nnnnnnnnnn\oooooooooo\abcdefghij.class (The system cannot find the path specified)
Yes, this is a system limitation: the classic Windows MAX_PATH limit of 260 characters, one of which is the terminating null. If you come across this issue with Schema projects, where namespace values cause the limit to be hit, try using xsdconfig to map the namespace value to a package structure.
Hope this helps.
Raj
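For reference, the 259-character ceiling matches the classic Windows MAX_PATH limit of 260 characters, one of which is consumed by the terminating null. A small Python sketch (the paths are hypothetical) to flag generated class-file paths that would exceed the limit:

```python
MAX_PATH = 260  # classic Windows limit, including the terminating null

def too_long(path: str) -> bool:
    # A path of 260 or more characters leaves no room for the null terminator.
    return len(path) >= MAX_PATH

# Hypothetical Workshop output path with a deep generated package structure.
base = r"C:\CVS\eai\src\workshopApps\eai\.workshop\output\_Test_Project\WEB-INF\classes"
pkg = "\\".join(["a" * 10] * 15)
candidate = base + "\\" + pkg + "\\abcdefghij.class"
print(len(candidate), too_long(candidate))
```

Running such a check over the planned output tree before a build would identify the offending classes without waiting for the compiler error.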