Error in Data Package Load
Hi,
We have a generic datasource that captures a table from R3.
3 out of 4 fields get transferred without problem.
I have made a new Characteristic to map the fourth one in my transfer rules.
I get the error
" Value "xyz" (hex. ...) of characteristic ZPS_PROJ contains invalid characters
I checked the R3 field, it contains no invalid chars, it comes into PSA also without errors.
Kindly help
Pranab
Hi,
Check the InfoObject ZPS_PROJ. Is it allowed to hold values in lowercase? If not, either use the TOUPPER formula in the transfer rules or enable "Lowercase letters allowed" for the InfoObject.
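The uppercase fix above can be sketched like this in Python; the helper and the record layout are illustrative, not actual BW APIs:

```python
def to_upper(value: str) -> str:
    """Mimic the TOUPPER transfer-rule formula: uppercase a
    characteristic value so it passes an InfoObject that does not
    allow lowercase letters."""
    return value.upper()

# Hypothetical record: a lowercase project ID would trigger the
# "contains invalid characters" error on load; uppercasing avoids it.
record = {"ZPS_PROJ": "abc-100"}
record["ZPS_PROJ"] = to_upper(record["ZPS_PROJ"])
print(record["ZPS_PROJ"])  # ABC-100
```

In BW itself the equivalent is either this formula in the transfer rules or ticking "Lowercase letters allowed" on the InfoObject.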
With rgds,
Anil Kumar Sharma .P
Similar Messages
-
Error Caller 09 contains error message - Data Marts loading(cube to ODS)
Dear all,
Please help me with this problem; it is very urgent.
I have one process chain that loads data within BW only, through data marts. In that chain, one process loads data from a cube (created by us) into an ODS (also created by us). Data is loaded via full update, for the period specified in the 'Calendar Day' field of the data selection.
Previously I was able to load data for 2 months, but some days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
Then we killed that process on the server and, on another attempt, it showed a CALMONTH/timestamp error. After reducing the data-selection period the load succeeded, and after that I was able to load data for 20 days. Then the process got stuck again; I followed the same procedure, reduced the period to 15 days and continued. Now I can't even load 5 days of data successfully in one attempt; I have to kill the background process and repeat it, and sometimes it then gets loaded.
Please suggest some solutions as soon as possible. I am waiting for your reply. Points will be assigned.
Thanks,
Pankaj N. Kude
Edited by: Pankaj Kude on Jul 23, 2008 8:33 AM
Hi Friends!
I didn't find any short dump for that in ST22.
Actually, what happens is that the request continues to run in the background indefinitely. During that time,
the Status tab in the Process Monitor shows these messages:
Request still running
Diagnosis
No errors found. The current process has probably not finished yet.
System Response
The ALE inbox of BI is identical to the ALE outbox of the source system
or
the maximum wait time for this request has not yet been exceeded
or
the background job has not yet finished in the source system.
Current status
in the source system
And the Details tab shows the following messages:
Overall Status : Missing Messages or warnings
Requests (Messages) : Everything OK
Data Request arranged
Confirmed with : OK
Extraction (Messages): missing messages
Data request received
Data selection scheduled
Missing message : Number of Sent Records
Missing message : selection completed
Transfer (IDOCS and TRFC) : Everything OK
Info Idoc1 : Application Document Posted
Info Idoc2 : Application Document Posted
Processing (data packet) : No data
This process runs for an indefinite time; then I have to kill it from the server, and it shows the Caller 09 error in the Status tab.
Should I change the value of that memory parameter on the server or not? We are planning to try it today. Does it really relate to this problem? Will it be helpful? What are the risks?
Please give your suggestion as early as possible; I am waiting for your reply.
Thanks,
Pankaj N. Kude -
Hi Everybody,
I got an error when I uploaded a text file from Data Manager through data files. It shows a green mark in the package status window and the data was entered into the SQL database, but when I uploaded the data file the next time it showed the error below:
System is not available. The system is unavailable due to a Data Management system package in process. Package Name= Append into Fact Table UserID= SAP-CPMVM\CPMadmin.
Can anybody help with this matter?
Thanks & Regards
Chandra Mohan.
After the first run of the package, did you set the application back to available?
The Append package processes the cube, which makes the appset unavailable.
Please open the Admin module and set the appset to available. Run the upload again and you will no longer receive the error.
Regards
Sorin Radulescu -
Update Rule Error ( Invalid Data) while load 0VENDOR
Hello All,
I have Vendor attributes such as 0POSTAL_CD, 10 characters (Postal Code), and 0SORTL, 10 characters (Sort Field). Since yesterday, the production master data loads for 0VENDOR have been failing with:
Record 211: 0POSTALCODE: Data record 211 ('03 2VEN4000040 '): Version 'HK HONG KONG ' is not valid
Record 136: 0SORTL: Data record 136 ('0002000175 '): Version 'SATTLER, I ' is not valid
Solutions tried:
Interestingly, what we found is: when you edit the PSA data for the above records and re-enter the same values, the load runs fine.
I found a thread somewhat related to my problem, but I couldn't get the solution from it.
Please help to solve this issue. Thanks for your support.
Hi Ram,
Adding more valid characters is indeed a possible option, but in your case I don't believe it is a good idea.
I believe this message comes from an automatic master data validation.
Example:
0SORTL: it is not valid to use a comma ( , ) or a space in this field
0POSTALCODE: the value 'HK HONG KONG ' does not seem to be a correct value for a zip code
I believe this validation is not about allowed extra characters.
Check both InfoObjects and see which data type they support.
Hope this Help! -
Error with data package execution from BPF or when running a package from a custom button
1. When running the package from a BPF or a custom button, an error appears (for all users): "Subscript out of range".
Package dynamic script
PROMPT(SELECTINPUT,,,,"BUDGETHOLDER,CONTRACTOR,ENTITY,RPTCURRENCY,SCENARIO,VERSION")
TASK(ZMF_BPC_LOGIC_RUN,EQU,%EQU%)
TASK(ZMF_BPC_LOGIC_RUN,SUSER,%USER%)
TASK(ZMF_BPC_LOGIC_RUN,SAPPSET,%APPSET%)
TASK(ZMF_BPC_LOGIC_RUN,SELECTION,%SELECTION%)
TASK(ZMF_BPC_LOGIC_RUN,SAPP,%APP%)
TASK(ZMF_BPC_LOGIC_RUN,LOGICFILENAME,PAY_DEPT.LGF)
2. After clicking OK, the selection screen appears.
3. When the package is run from the eData menu (Run Package), everything is OK. And after running it that way, running the package from the BPF or the custom button also works (but only for the user who ran it from the eData menu).
4. When BPC Excel is closed and a new BPC Excel is launched, the error appears again.
The partners have posted a Message 247960 / 2010
created 23.03.2010 - 14:15:03 CET
Could you please comment on the issue?
Hi Usman,
The solution is :
1. Edit the files updateConfigProperties.properties and
prependConfigProperties.properties in the installation directory:
Replace the hostname with the value present during the initial installation.
Note that the installation directory is the directory where SAPinst creates its log files.
The original hostname should be reflected in the names of the profiles created by the installation
(/usr/sap/<SID>/SYS/profile/<SID>_<Instance>_<hostname>).
If the log files of the initial installation are still available, the original hostname can also be found in the file jengine.properties, in the value of the property box.number. This file is located in the installation directory of the initial installation.
2. Restart the installation.
This has to work; it worked for me.
P.S.: Give points, OK?
Rgs
vikas -
Error In DTP:Data package 1 / XDateX XTimeX Status 'Processed with Errors'
Hi,
While pushing data from the DSO (0PP_DS01 - Planned/Actual Comparison Order/Material View) to cube 0PP_C15 (Backlogged Production Orders),
I get the error: Data package 1 / XDateX XTimeX Status 'Processed with Errors'
The long text shows the same message, i.e. Data package 1 / XDateX XTimeX Status 'Processed with Errors'.
It does not show any further error message that I can act on.
If anyone has come across this error please provide the resolution.
Thanks,
Adhvi Rao
Hi Adhvi,
Are you using a DTP to load the data to the target? Then you can enable the error stack to collect the erroneous records.
Also check whether you have activated the request in your source DSO.
Regards
Dinesh -
RSKC ( Error in data loading)
Hi Expertz!!
Good Morning to Everyone.
I am troubled by a data loading error. After my data load,
I got an error like this:
Transfer (IDocs and TRFC): Errors occurred
- Data Package 1 : arrived in BW ; Processing : Data records for package 1 selected in PSA - 1 error.
Processing (data packet): Errors occurred
Data Package 1 ( 7102 Records ) : Errors occurred
- Update ( 0 new / 0 changed ) : Errors occurred
- Data records for package 1 selected in PSA - 1 error(s)
- Record 4044 :
Value 'TING,BFUL-06 = 24hrs Machine"TREATMENT,
' (hex. ' ') of characteristic ZICR (field name).
The value in the PSA looks like this: 'TING,BFUL-06 = 24hrs Machine"TREATMENT,
So I want to know what that 'hex' code is.
A friend told me to cut the symbol and paste it into an MS Excel sheet to see the hidden character. I did this, but I see no value; it turns out there was an ENTER (line break) and a SPACE. So which characters should I enter in RSKC?
Can I find out which characters I should add in RSKC, including the character for SPACE?
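One quick way to see exactly which hidden characters (the ENTER and SPACE mentioned above) are inside a value is to dump the code points of its non-printable characters; a small Python sketch, with an illustrative sample string:

```python
def hidden_chars(value: str):
    """List (character, hex code point) for every non-printable
    character in a value - the part the load monitor reports
    as '(hex. ...)'."""
    return [(c, format(ord(c), "04X")) for c in value if not c.isprintable()]

sample = 'TING,BFUL-06 = 24hrs Machine"TREATMENT,\r\n'
print(hidden_chars(sample))  # carriage return (000D) and line feed (000A)
```

Control characters like these typically cannot be made permitted via RSKC; they usually have to be removed in the transfer rules instead.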
Kindly help me.
Answers are appreciated!
Cheers,
Hi Aluri,
I have faced the same problem before; the solution goes like this.
Possible cause:
If you look closely, there are capital letters, small letters and numerics. In such cases you have to convert strings like "24hrs ..." from lowercase to uppercase.
Possible solution:
Go to the transfer structure and identify the field (InfoObject) on which the error occurs.
Create a new formula; its definition looks like this:
TOUPPER(InfoObject). To find TOUPPER, search in the functions and scroll down.
Come back and activate the transfer structure.
Reload the data; this time it should go through.
All the Best... -
HI gurus,
I have a problem with the BW upload from the R3 system. Until last week everything worked perfectly and automatically, but we added some companies to the R3 system and now the job runs into problems every day and we have to launch it manually.
All data is transferred correctly to BW, but then it is not posted to the PSA, and in the details we have the following error message:
data packaged 1: arrived in BW; processing: 2nd processing step not yet finished
Does someone have any idea about it?
Thanks in advance.
Yes, it's as you wrote. When I execute it manually it works fine. The problem occurs when it runs automatically in the background, as the daily job.
I don't have any dump (ST22) in BW or R3. I also checked WE05, BD87, SM58 and others, and everything looks fine.
thanks for the answer.. -
Error during data load due to special characters in source data
Hi Experts,
We are trying to load Billing data into the BW using the billing item datasource. It seems that there are some special characters in the source data. When the record with these characters is encountered, the request turns red and the package is not loaded even into the PSA. The error we get in the monitor is something like
'RECORD 5028: Contents from field **** cannot be converted into type CURR',
where the field **** is a key figure of type currency. We managed to identify the said record in RSA3 on the source system and found that one of the fields contains some invalid (special) characters that show up as squares in RSA3. The data in the rest of the fields, including the fields mentioned in the error looks correct.
Our source system is a non-unicode system wheras the BW system is unicode enabled. I figure that the data in the rest of the fields are getting misaligned due to the presence of the invalid characters in the above field. This was confirmed when we unassigned the field with the special characters from the transfer rules and removed the source field from the transfer structure. After doing this the data was loaded successfully and the request turned green.
Can anyone suggest a way to either filter out such invalid characters from the source data or make some settings in the BW systems such that the special characters are no longer invalid in the BW system? We cannot write code in the transfer rules because the data package does not even come into the PSA. Is there any other method to solve this problem?
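A minimal sketch of the filtering asked about above, dropping control characters from a field before it is extracted, might look like this in Python (where to hook such cleansing in, e.g. on the source-system side before extraction, depends on your landscape):

```python
import unicodedata

def strip_control_chars(value: str) -> str:
    """Remove control/format characters (Unicode categories Cc, Cf)
    from a field value so stray bytes cannot misalign the record."""
    return "".join(ch for ch in value if unicodedata.category(ch)[0] != "C")

print(strip_control_chars("AB\x00C\x1fD"))  # ABCD
```

This is only an illustration of the idea; in the scenario above the cleansing must happen before the transfer rules, since the package never reaches the PSA.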
Regards,
Ted
Hi Ted,
I was wondering: whether the system is Unicode or non-Unicode should not matter for the amount and currency fields, since currencies are defined by SAP and are plain English, at least the 3-character currency code part.
Could this be because of some inconsistency in the data?
I would like to know which currency had the special characters in that particular record.
Hope that helps.
Regards
Mr Kapadia -
"Error in data selection" while loading
hi,
While loading data from one ODS to another (data mart), after successful extraction of some data packages I am getting the error "Error in data selection".
Its help messages were:
1) error 7 when sending an IDoc
2) SQL error in accessing table no. 13
3) error in source system
The source ODS was loaded via init from R/3.
Please help.
Hi,
Take a look at consulting note 610938.
When reading the note, consider BW itself in place of the R/3 system.
With rgds,
Anil Kumar Sharma .P -
Data Package is not consistent error
Hi All,
We have a BEx issue; a screenshot is given in the link below. We have an InfoCube with a huge data volume that we load very frequently (every hour, from 2 sources). This InfoCube has been loading for about the last 5 years. We stopped compressing the data about a year ago for some reason and are thinking of compressing it again soon. All of a sudden, users are getting the error shown in the link below more frequently. Not much information is available on the Service Marketplace about this issue. Can somebody please shed some light on it?
[Error Screenshot|https://photos-1.dropbox.com/i/l/orsd6TKDlIBVoH4JO79GWQAPNFPlTa88QJ0zkV7Lcc0/53945346/1324418400/3799537/Bex%20Error.JPG#1]
Thanks and Regards
Subray Hegde
Hi Deepak,
I am not able to attach a screenshot. Somebody directed me to servimg.com, but I do not know how to create an account on that site. The error reads like this:
"Error: Aggregates of <Infocube> were compressed. Data package is not consistent. Error. Error while reading data; navigation possible."
Thanks and Regards
Subray Hegde -
Data Package 000001 : sent, not arrived. No data load in BW 7
Hi gurus, I am working with BW 7 and I already had the complete data flow from SRM to BW: some Business Content extractors and some generic extractors. We created transformations, InfoPackages and DTPs, and in fact we loaded master data and transactional data. But since last Monday the data load is not working any more; every time we manually run the InfoPackage, the following warning message appears:
Data Package 000001 : sent, not arrived
and the data load just waits for an answer from SRM (the source system), or SRM waits for a request from BW.
We have reactivated the extractors in SBIW, and replicated and reactivated them in BW (we also reactivated the transformations), but the data load is still not working. We then regenerated the DataSource, but it does not work either.
Do you have any idea of what is happening or what the problem could be?
Thanks.
Check transaction SMQA in the source system. The communication might not be working. If any entries are in RED, select "Execute LUWs" from the menu to process them manually.
Read the error and try to resolve it first, though.
Reason for Error in Manual Update of Data Package
Hey,
Sometimes, while loading a huge amount of data, one of the data packages does not get uploaded. For those data packages we do a manual update, and it works well.
My question is: why do we get this error?
Does it have to do with system contention or an IDoc issue?
Please help me find the root cause so that we can avoid the manual update of the data packages.
We usually get this issue when doing a lot of processing in the update rules: the data package processing gets timed out.
If it happens very often, you would need to sit with your Basis person and modify the timeout settings for your system.
Clear Data Manager Package Error "The data file is empty."
Hi,
When I run the Clear data package in Data Manager, I receive the error "The data file is empty." I selected a very specific set of dimension values (none are calculated) and am on BPC 7.5 SP3. I subsequently turned on debugging to troubleshoot, but do not see any obvious issues leading to the error message. The log file with debugging turned on is below. Any help would be greatly appreciated!
Thanks.
Tom
TOTAL STEPS 3
1. Export_Zero: completed in 1 sec.
2. Load Cube: Failed in 0 sec.
3. Clear: completed in 0 sec.
[Selection]
ENABLETASK= Yes
CHECKLCK= Yes
(Member Selection)
Category: ACTUAL
Time: 2010.C_SEP
Affiliate: az_swhd
Account: Donor_DART_ID_1
Functional: Benchmark_F
Report: Cons
Restriction: AnyRestricted
[Messages]
The data file is empty. Please check the data file and try again.
[EvModifyScript Detail]
12-28-2010 17:30:05 - Debug turned ON
INFO(%TEMPFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
TASK(EXPORT_ZERO, APPSET, ESMetrics)
TASK(EXPORT_ZERO, APP, CONSOLIDATED)
TASK(EXPORT_ZERO, USER, NESSGROUP\tbardwil)
TASK(EXPORT_ZERO, FILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
TASK(EXPORT_ZERO, SQL,
select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
TASK(EXPORT_ZERO, DATATRANSFERMODE, 2)
TASK(LOAD CUBE, APPSET, ESMetrics)
TASK(LOAD CUBE, APP, CONSOLIDATED)
TASK(LOAD CUBE, USER, NESSGROUP\tbardwil)
TASK(LOAD CUBE, FILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
TASK(LOAD CUBE, DATATRANSFERMODE, 4)
TASK(LOAD CUBE, DMMCOPY, 0)
TASK(LOAD CUBE, PKGTYPE, 0)
TASK(LOAD CUBE, CHECKLCK, 1)
TASK(CLEAR COMMENTS, APPSET, ESMetrics)
TASK(CLEAR COMMENTS, APP, CONSOLIDATED)
TASK(CLEAR COMMENTS, USER, NESSGROUP\tbardwil)
TASK(CLEAR COMMENTS, DATATRANSFERMODE, 0)
TASK(CLEAR COMMENTS, SELECTIONORFILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
TASK(CLEAR COMMENTS, ENABLETASK, 1)
TASK(CLEAR COMMENTS, CHECKLCK, 1)
INFO(%ENABLETASK%, 1)
INFO(%CHECKLCK%, 1)
INFO(%SELECTION%, [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'))
INFO(%TOSELECTION%, [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'))
INFO(%APPSET%, ESMetrics)
INFO(%APP%, CONSOLIDATED)
INFO(%CONVERSION_INSTRUCTIONS%, )
INFO(%FACTCONVERSION_INSTRUCTIONS%, )
INFO(%SELECTIONFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\FROM_51_.TMP)
INFO(%TOSELECTIONFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\TO_51_.TMP)
INFO(%DEFAULT_MEASURE%, PERIODIC)
INFO(%MEASURES%, Periodic,QTD,YTD)
INFO(%OLAPSERVER%, ETSCSAP047940.EASTER-SEALS.ORG)
INFO(%SQLSERVER%, ETSCSAP047940.EASTER-SEALS.ORG)
INFO(%APPPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\)
INFO(%DATAPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\DataFiles\)
INFO(%DATAROOTPATH%, C:\BPC\Data\WebFolders\)
INFO(%SELECTIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\SelectionFiles\)
INFO(%CONVERSIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\ConversionFiles\)
INFO(%TEMPPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\)
INFO(%LOGICPATH%, C:\BPC\Data\WebFolders\ESMetrics\Adminapp\CONSOLIDATED\)
INFO(%TRANSFORMATIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\TransformationFiles\)
INFO(%DIMS%, [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[Time])
INFO(%FACTDIMS%, [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID])
INFO(%CATEGORY_DIM%, [Category])
INFO(%TIME_DIM%, [Time])
INFO(%ENTITY_DIM%, [Affiliate])
INFO(%ACCOUNT_DIM%, [Account])
INFO(%CURRENCY_DIM%, )
INFO(%APP_LIST%, Consolidated,ES_INC,GrantMgmt,LegalApp,LRate,Ownership,Rate)
INFO(%ACCOUNT_SET%, DONOR_DART_ID_1)
INFO(%AFFILIATE_SET%, AZ_SWHD)
INFO(%CATEGORY_SET%, ACTUAL)
INFO(%FUNCTIONAL_SET%, BENCHMARK_F)
INFO(%REPORT_SET%, CONS)
INFO(%RESTRICTION_SET%, ANYRESTRICTED)
INFO(%TIME_SET%, 2010.C_SEP)
INFO(%ACCOUNT_TO_SET%, DONOR_DART_ID_1)
INFO(%AFFILIATE_TO_SET%, AZ_SWHD)
INFO(%CATEGORY_TO_SET%, ACTUAL)
INFO(%FUNCTIONAL_TO_SET%, BENCHMARK_F)
INFO(%REPORT_TO_SET%, CONS)
INFO(%RESTRICTION_TO_SET%, ANYRESTRICTED)
INFO(%TIME_TO_SET%, 2010.C_SEP)
INFO(DATAMGRGLOBALBPU, )
INFO(DATAMGRGLOBALCLIENTMACHINEID, ETSCWLT048794)
INFO(DATAMGRGLOBALERROR, )
INFO(DATAMGRGLOBALPACKAGEINFOR, )
INFO(DATAMGRGLOBALPACKAGENAME, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\PackageFiles\System Files/Clear.dtsx)
INFO(DATAMGRGLOBALSEQ, 51)
INFO(DATAMGRGLOBALSITEID, )
INFO(MODIFYSCRIPT, DEBUG(ON)<BR>PROMPT(SELECTINPUT,[CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'),,"SELECT THE MEMBERS TO CLEAR",[Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[Time])<BR>PROMPT(RADIOBUTTON,1,"DO YOU WANT TO CLEAR COMMENTS ASSOCIATED WITH DATA REGIONS IN BPC?",1,{"YES","NO"},{"1","0"})<BR>PROMPT(RADIOBUTTON,1,"SELECT WHETHER TO CHECK WORK STATUS SETTINGS WHEN DELETING COMMENTS.",1,{"YES, DELETE COMMENTS WITH WORK STATUS SETTINGS","NO, DO NO DELETE COMMENTS WITH WORK STATUS SETTINGS"},{"1","0"})<BR>INFO(C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempfdla_51_.tmp)<BR>TASK(EXPORT_ZERO,APPSET,ESMetrics)<BR>TASK(EXPORT_ZERO,APP,CONSOLIDATED)<BR>TASK(EXPORT_ZERO,USER,NESSGROUP\tbardwil)<BR>TASK(EXPORT_ZERO,FILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(EXPORT_ZERO,SQL,
select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
)<BR>TASK(EXPORT_ZERO,DATATRANSFERMODE,2)<BR>TASK(LOAD CUBE,APPSET,ESMetrics)<BR>TASK(LOAD CUBE,APP,CONSOLIDATED)<BR>TASK(LOAD CUBE,USER,NESSGROUP\tbardwil)<BR>TASK(LOAD CUBE,FILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(LOAD CUBE,DATATRANSFERMODE,4)<BR>TASK(LOAD CUBE,DMMCOPY,0)<BR>TASK(LOAD CUBE,PKGTYPE,0)<BR>TASK(LOAD CUBE,CHECKLCK,1)<BR>TASK(CLEAR COMMENTS,APPSET,ESMetrics)<BR>TASK(CLEAR COMMENTS,APP,CONSOLIDATED)<BR>TASK(CLEAR COMMENTS,USER,NESSGROUP\tbardwil)<BR>TASK(CLEAR COMMENTS,DATATRANSFERMODE,0)<BR>TASK(CLEAR COMMENTS,SELECTIONORFILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(CLEAR COMMENTS,ENABLETASK,1)<BR>TASK(CLEAR COMMENTS,CHECKLCK,1)<BR>BEGININFO(
select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
)<BR><BR><BR><BR><BR><BR><BR>SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) AS ZEROTABLE GROUP BY [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)<BR><BR><BR><BR>ENDINFO<BR><BR><BR>)
Edited by: Tom Bardwil on Dec 28, 2010 5:20 PM
You can greatly improve your chance of receiving a helpful answer to your question if you state the version (MS or NW) and the release (5.1, 7.0, 7.5) of BPC which you are using.
Also notice the sticky note "Please do not post BPC, SSM or FI/CO questions here!" at the top of this forum, whereby we announced new dedicated forums for BPC, which are the proper place to post your BPC questions in the future so that they reach the right audience.
Thanks and best regards,
[Jeffrey Holdeman|http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman]
SAP Labs, LLC
BusinessObjects Division
Americas Applications Regional Implementation Group (RIG) -
Error Assigning SID-cannot load data in ods
Hi
I have data in an ODS coming from R/3.
Until today the data load was working fine.
When I tried to load the data today, I got an error. The data is in the PSA.
The data also seems to be in the ODS: the request is red, but the reporting icon is still available,
and the data mart status is empty.
I checked the call log and found the following error.
Overview of Call log is as follows
Activation of Data
SID Generation
Data package 3 : Error when assigning SID: Action VAL_SID_CONVERT InfoObject 0DATEFROM
Value '01012009' of characteristic 0DATE is not plausible
Data package 6: Error when assigning SID: Action VAL_SID_CONVERT InfoObject 0DATETO
Value '31122009' of characteristic 0DATE is not plausible 1
Rest all packages for the various infoobjects are loaded successfully
I think there is a problem in the data, but I don't know how to resolve the issue.
Please help.
Regards
Hi,
The problem you are facing is that the target format is YYYYMMDD while you are trying to load data in the format DDMMYYYY.
The error is in the current request; you don't need to touch anything else. As I said earlier, go to transaction RSMO and find your failed load. Click on it. On the right side of your screen you will get the individual monitor, which has a Status tab. On this tab there is a button that says "Error message". Click on it.
It will give you several messages. If you scroll to the right, there is a "?" symbol for the long text. Click on it.
This will give you further info. One of these messages will have the record number and package number. You need to check all messages. Then go to your PSA, to the same package and record, and check the date format.
Another way is to go to SE16 and find the active table of your DSO; it will be named /BIC/A<DSO NAME>40.
Search the respective date field for the value thrown in the error, or use the record number obtained above to find the value. Find the key fields and note them down.
You then need to delete the incorrect request from your DSO. Then correct the incorrect record in the PSA using the key fields, save, and reconstruct the request.
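The format mismatch described above (DDMMYYYY arriving where the internal YYYYMMDD format is expected) comes down to a small conversion; a Python sketch of the idea:

```python
from datetime import datetime

def ddmmyyyy_to_internal(value: str) -> str:
    """Convert a DDMMYYYY date string to the YYYYMMDD internal format
    that 0DATE expects; raises ValueError for implausible dates."""
    return datetime.strptime(value, "%d%m%Y").strftime("%Y%m%d")

print(ddmmyyyy_to_internal("01012009"))  # 20090101
print(ddmmyyyy_to_internal("31122009"))  # 20091231
```

In BW itself the equivalent fix is a conversion routine or formula in the transfer rules rather than correcting each PSA record by hand.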
Hope this helps.
Regards.