Query from 4.6c to ecc 6
Hi,
We are upgrading queries from 4.6C to ECC 6, using export/import. When I export a query via RSAQR3TR, I get the error:
F Transport dataset AD1D910286 does not exist.
Please help with a solution.
Regards
Ajith
Hi,
You can try reading OSS note 352617.
Ivan
Similar Messages
-
Asset query execution performance after upgrade from 4.6C to ECC 6.0+EHP4
Hi guys,
I have encountered a weird problem with asset query execution performance after upgrading to ECC 6.0.
Our client migrated their SAP system from 4.6C to ECC 6.0. We tested all transaction codes and the related standard reports and queries.
Everything works normally except this asset depreciation query report. It is based on the ANLP, ANLZ, ANLA, ANLB and ANLC tables; there is also some ABAP code for additional fields.
This report took about 6 minutes to execute in the 4.6C system; however, it takes 25 minutes in ECC 6.0 with the same selection parameters.
At first I tried to find differences in table indexes and structures between 4.6C and ECC 6.0, but there are none.
I am wondering why the other query reports run normally but only this report runs so long (even ending in timeout dumps), even though we have not made any changes to it.
Your replies are very much appreciated.
Regards
Brian
Thanks for your replies.
I checked these notes; unfortunately they don't match our situation.
Our situation is that all standard asset reports and queries (SQ01) run normally except this one query report.
I ran SE30 for this query (SQ01) on both 4.6C and ECC 6.0.
I found differences in the select sequence logic, even though it is the same query without any changes.
I list them here for your reference.
4.6C
AQA0FI==========S2============
Open Cursor ANLP 38,702 39,329,356 = 39,329,356 34.6 AQA0FI==========S2============ DB Opens
Fetch ANLP 292,177 30,378,351 = 30,378,351 26.7 26.7 AQA0FI==========S2============ DB OpenS
Select Single ANLC 15,012 19,965,172 = 19,965,172 17.5 17.5 AQA0FI==========S2============ DB OpenS
Select Single ANLA 13,721 11,754,305 = 11,754,305 10.3 10.3 AQA0FI==========S2============ DB OpenS
Select Single ANLZ 3,753 3,259,308 = 3,259,308 2.9 2.9 AQA0FI==========S2============ DB OpenS
Select Single ANLB 3,753 3,069,119 = 3,069,119 2.7 2.7 AQA0FI==========S2============ DB OpenS
ECC 6.0
Perform FUNKTION_AUSFUEHREN 2 358,620,931 355
Perform COMMAND_QSUB 1 358,620,062 68
Call Func. RSAQ_SUBMIT_QUERY_REPORT 1 358,569,656 88
Program AQIWFI==========S2============ 2 358,558,488 1,350
Select Single ANLA 160,306 75,576,052 = 75,576,052
Open Cursor ANLP 71,136 42,096,314 = 42,096,314
Select Single ANLC 71,134 38,799,393 = 38,799,393
Select Single ANLB 61,888 26,007,721 = 26,007,721
Select Single ANLZ 61,888 24,072,111 = 24,072,111
Fetch ANLP 234,524 13,510,646 = 13,510,646
Close Cursor ANLP 71,136 2,017,654 = 2,017,654
We can see that in 4.6C it first opens a cursor on ANLP and fetches from ANLP, then selects from ANLC, ANLA, ANLZ and ANLB.
But in ECC 6.0 it first selects from ANLA, then opens the cursor on ANLP, then selects from ANLC, ANLB and ANLZ, and only fetches from ANLP at the end.
Probably this is the real reason why it runs so long in ECC 6.0.
Were there any changes to the query selection logic (table join handling) in ECC 6.0? -
Performance issue after Upgrade from 4.7 to ECC 6.0 with a select query
Hi All,
There is a performance issue with a select query in a Report Painter report after upgrading from 4.7 to ECC 6.0.
The query works fine when executed in the 4.7 system, whereas it runs much longer in ECC 6.0.
The select query is on table COSP.
SELECT (FIELD_LIST)
INTO CORRESPONDING FIELDS OF TABLE I_COSP PACKAGE SIZE 1000
FROM COSP CLIENT SPECIFIED
WHERE GJAHR IN SELR_GJAHR
AND KSTAR IN SELR_KSTAR
AND LEDNR EQ '00'
AND OBJNR IN SELR_OBJNR
AND PERBL IN SELR_PERBL
AND VERSN IN SELR_VERSN
AND WRTTP IN SELR_WRTTP
AND MANDT IN MANDTTAB
GROUP BY (GROUP_LIST).
LOOP AT I_COSP .
COSP = I_COSP .
PERFORM PCOSP USING I_COSP-_COUNTER.
CLEAR: $RWTAB, COSP .
CLEAR CCR1S .
ENDLOOP.
ENDSELECT.
I have checked the table indexes; they are the same as in the 4.7 system.
What can be the reason for the difference in execution time, and how can it be reduced without adjusting the select query?
Thanks in advance for the responses.
Regards,
Dedeepya
Hi,
There are quite a few problems in this select query; this is not the way it should be written.
Some generic comments:
1. Never use SELECT ... ENDSELECT. Select INTO TABLE instead (with FOR ALL ENTRIES where appropriate) and use a PERFORM for any processing after the selection.
2. Do not use INTO CORRESPONDING FIELDS; use a target with the exact structure type.
3. Use the proper sequence of fields in the WHERE condition so that the database can follow the table's indexes. E.g. in your case the sequence should be:
LEDNR
OBJNR
GJAHR
WRTTP
VERSN
KSTAR
HRKFT
VRGNG
VBUND
PARGB
BEKNZ
TWAER
PERBL
The sequence should be the same as defined in the table.
Always keep the select query as simple as possible and perform all other calculations etc. afterwards.
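Putting the three points together, a minimal sketch of a rewritten selection might look like this (the field list and structure are illustrative, and the SELR_* ranges are assumed to be declared as in the original report):

```abap
TYPES: BEGIN OF ty_cosp,
         lednr TYPE cosp-lednr,
         objnr TYPE cosp-objnr,
         gjahr TYPE cosp-gjahr,
         wrttp TYPE cosp-wrttp,
         versn TYPE cosp-versn,
         kstar TYPE cosp-kstar,
       END OF ty_cosp.

DATA: lt_cosp TYPE STANDARD TABLE OF ty_cosp,
      ls_cosp TYPE ty_cosp.

* One array fetch instead of SELECT ... ENDSELECT, an exact target
* structure instead of CORRESPONDING FIELDS, and the WHERE fields in
* the same order as the table definition/index.
SELECT lednr objnr gjahr wrttp versn kstar
  FROM cosp
  INTO TABLE lt_cosp
  WHERE lednr = '00'
    AND objnr IN selr_objnr
    AND gjahr IN selr_gjahr
    AND wrttp IN selr_wrttp
    AND versn IN selr_versn
    AND kstar IN selr_kstar.

* Do all further processing on the internal table afterwards.
LOOP AT lt_cosp INTO ls_cosp.
  PERFORM pcosp USING ls_cosp.   " illustrative; adapt to the original form routine
ENDLOOP.
```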
I hope it helps.
Regards,
Pranaya -
Error while RRI from BW reports to ECC CJI3 transaction
Hi All,
When we try to set up RRI from BW reports to the ECC CJI3 transaction, we get an error message (stop message):
You cannot use the report-report interface to call up report RKREP000
Diagnosis
Line Item report RKPEP000 is not suitable for the report/report interface
Procedure
Always enter RKPEP000 as the receiver report for project line items in the report/report interface. This program uses the transfer parameters to determine which line item reports can be accessed.
As per the other reply, I tried the ABAP report RKPEP000/3 as well, with no luck.
Can you please suggest how to go about this CJI3 issue for RRI, and what settings need to be done in RSBBS?
We have also attempted to use S_ALR_87013543, but the parameters do not pass through properly, so when a user jumps from a BW report they end up at the ECC selection screen of S_ALR_87013543.
Drill-through works for our Cost Center reports, just not this one in particular.
Any advice would be greatly appreciated.
prasad
Hi,
I'm facing the same problem, so if you have found a solution, please tell me what I should do to use RRI from a BEx query to the CJI3 transaction.
Regards,
Ana -
Strategy for ensuring replication is complete before initating query from source system
Hi,
I am using HANA as a side-car scenario with reports running in SAP ECC being accelerated by querying replicated tables in SAP HANA instead. This works well, however I don't have a good mechanism to validate before running a report whether the underlying data has already been replicated to HANA or is still queued up.
Often users would want to run the reports soon after large data changes have been made in the source tables. It is unknown, based on the overall workload, how long it might take for the most recently written records to get replicated to the corresponding HANA table.
What is a good-practice approach to handle this? I have seen some separate threads on doing record counts between HANA and ECC. I think that is not a good idea at all. Firstly, for large tables, the time overhead of doing the record count is very large: in the time it takes me to query the record count in ECC, the HANA report could have run 10 times over. More importantly, for very large source tables, I may have opted to replicate only recent data, leaving old historical data in ECC un-replicated to HANA.
I know this is not a new problem. SAP must have already addressed it a number of ways for their own delivered application accelerators. The COPA accelerator for instance must be doing something along these lines. Possibly querying most recent records in ECC and comparing them to the most recent records in HANA for the same tables might be a way to go.
Does anyone else have insights into how to best approach this? Does SLT expose a mechanism to check whether replication is completed for any given table?
thanks,
Nitin Goel
Hi Nitin,
SLT can never confirm whether replication is complete for a given table. Replication is a continuous process: if records are in the logging table in ECC, they will be replicated via SLT.
So the best way to check replication completion is:
1. Go to SE16 (ECC) and enter the logging table you want to check (copy the logging table name from LTRC). If the number of entries is 0, nothing is waiting in the logging table.
You can verify in LTRC (SLT), under Expert Functions, that replication is working fine.
2. Then go to SE16 (ECC) and check the number of entries in the original table.
3. Check in HANA that the number of entries is the same as in the source table.
It takes about a minute to verify each table's replication, not more than that.
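As a small illustration of step 1, the backlog check could also be scripted instead of using SE16. This is a hypothetical sketch; the logging table name below is purely an example (copy the real one from LTRC):

```abap
* Count the entries in an SLT logging table to see whether a
* replication backlog remains. '/1CADMC/00000001' is an example
* name; take the real logging table name from LTRC.
DATA lv_count TYPE i.

SELECT COUNT(*) FROM ('/1CADMC/00000001') INTO lv_count.

IF lv_count = 0.
  WRITE: / 'Logging table empty - replication has caught up.'.
ELSE.
  WRITE: / lv_count, 'records still waiting to be replicated.'.
ENDIF.
```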
Regards,
Joydeep. -
Ad Hoc Query testing when upgrading to ECC 6.0
Hi,
We have a decentralized SAP HR system. Instead of developing custom ABAP reports, we have given superusers permission to create their own reports using Ad Hoc Query. This has led to a large number of Ad Hoc Queries (>2000); technically speaking, a lot of custom ABAP programs.
We are now upgrading from 4.6C to ECC 6.0.
Based on your upgrade experience please advise:
1. Does the upgrade have any effect on the Ad Hoc Queries? For example: they no longer work, all queries need to be regenerated, the PNP or PCH logical databases are missing, global InfoSets are missing, etc.
2. Do you have to test all the queries, or does checking just a few critical ones satisfy your testing requirements?
Your insight is greatly appreciated. Points will be awarded.
Thanks
Praful
Hi,
In terms of the upgrade, you need to take care of all Z programs. I don't think it will affect your ad hoc reports unless you have used custom fields in them. You can always reconcile the reports during the upgrade.
Good Luck
Om. -
What is the impact of R/3 upgradation from 4.7 to ECC 6 on BI
Dear all,
Can anyone tell me what the impact of the R/3 upgrade on BI will be, as we are shortly going to upgrade our R/3 from 4.7 to ECC 6?
Do we need to take any precautions in R/3 as well as in BI?
Please provide the information; points will be given.
Regards
venu
Hi,
Please refer to this, as it will give you the pros and cons of the upgrade.
Refer
http://wiki.ittoolbox.com/index.php/Upgrade_BW_to_Netweaver_2004s_from_v3.0B
This Wiki contains the experiences of Rob Moore's (BW Manager, Edwards Limited) team in upgrading their Business Warehouse system from 3.0B to BW 7.0.
Upgrading from BW 3.0B to BW 7.0 (Netweaver 2004s)
Introduction
This Wiki contains my team's experiences in upgrading our Business Warehouse system from 3.0B to BW 7.0.
Hopefully it will be useful to anyone else who's about to embark on this. If there's anything I've missed or got wrong, then please feel free to edit it or contact me and I'll try to explain.
Rob Moore - BW Manager, Edwards Limited.
Overview & Scope
This was to be a technical upgrade of BW only. The new BW 7.0 web functionality & tool suite which requires the Java stack rather than the ABAP stack was out of scope. We had heard that the latter was where most of the problems with the upgrade lay. Our plan is to wait for this part of BW 7.0 to become more stable. Also it has a big front end change and the business didn't have sufficient resource to cope with that much change management.
Drivers
3.0B at the end of its maintenance
Opportunities to do better reporting
Options to utilise BI Accelerator
Environment
Our R/3 system was at 4.6C and was not going to be upgraded. We have APO at version SCM4.0.
Our BW system is approximately 300 GB, with 125 global users. It was at version 3.0B SP 18
We have Development, Acceptance and Production environments.
Resource & Timescales
The project ran for 3.5 months, from February to May 2007. We used the following resources; the percentages are the approximate amount of their time spent on the project.
Project Manager * 1 70%
BW technical team * 3 50%
ABAP coder * 1 10%
SAP Systems Development expert * 1 20%
Basis * 1 25%
High Level Plan
These are the basic areas. We planned to complete this process for each environment in turn, learning our lessons at each stage and incorporating into revised plans for the next environment. However we did the Support Packs and Plug-Ins in quick succession on all environments to keep our full transport path open as long as possible.
Upgrade BW to the minimum support pack.
Install R/3 Plug-ins PI 2004.1
Run PREPARE on BW
Dbase upgrades (Database & SAP Kernel upgrade)
System Upgrade
Summary Task List
This list contains all the basic tasks that we performed. A more detailed check list is shown below for each of the headings.
#Support Pack Hike
We moved only to the minimum acceptable SP as this seemed most likely to avoid any problems with the 3.0B system prior to the upgrade.
Apply OSS 780710 & 485741
Run Baseline for Regression tests
Full Backup
Apply SP's (we were at SP18, going to SP20)
SPAU list review
Regression testing
#Plug-in installation
Apply SAP note 684844
Import and patch Basis plugin PI 2004.1
SPAU list review
#PREPARE process
BW Pre-Prepare tasks
BW - Inconsistent Data fix
Run PREPARE
Review of results
Any showstoppers from PREPARE?
#Dbase upgrades
Database Upgrade (FixPak)
SAP Kernel Upgrade
#System Upgrade
Reverse Transport of Queries
Reconnect DAB & SAB to AAE
Run Baseline for Regression tests
Full Backup
Run the Upgrade
SPAU list review
Regression testing
Lessons Learnt
Testing is all! We picked up on a lot of issues, but would have picked up more if we'd had a full copy of our production environment to test over.
Our approach of doing a full upgrade on each environment before moving to the next one paid dividends in giving us experience of issues and timescales.
Write everything down as you go, so that by the time you get to upgrading production you've got a complete list of what to do.
We succeeded because we had people on our team who had enough experience in Basis and BW to be able to troubleshoot issues and not just read the manual.
The SAP upgrade guide is pretty good, if you can understand what they're on about...
Remember the users! The fact that the loads have been successful doesn't count for anything unless the users can see the data in Excel! There's a tendency to get caught up in the technology and forget that it's all just a means to an end.
Issues & Fixes
I've listed the main issues that we encountered. I have not listed the various issues where Transfer rules became Inactive, DataSources needed replication or we had to reinstall some minor Business Content.
Unfixed Issues
We could not fix these issues; it seems SP 13 will help.
Can't delete individual requests from ODS
After PREPARE had been run, if we had a load failure and needed to set an ODS request to Red and delete it, we found that we could not. We raised an OSS with SAP but although they tried hard we couldn't get round it. We reset the PREPARE and still the issue persisted. Ultimately we just lived with the problem for a week until we upgraded production.
Error when trying to save query to itself
Any query with a re-usable structure cannot be saved (more than once!).
OSS 975510 fixed the issue in Dev and Acc, but NOT in Production! SP 13 may solve this once it's released.
Warning message when running some queries post-upgrade
"Time of calculation: Before Aggregation" is obsolete. Not a big issue, so we haven't fixed this one yet!
Process Chain Scheduling Timing error
Process chains sometimes get scheduled for the NEXT day and have to be manually reset.
See OSS 1016317. Implement SP13 to fix this. We will live with it for now.
Fixed Issues
Duplicate Fiscal Period values in Query
If you open up a drop down box ("Select Filter Value") for Fiscal Year/Period to filter your query, you are presented with duplicate entries for Month & Year.
Due to Fiscal Year Period InfoObject taking data from Master Data not InfoProvider. Thus it picks up all available periods not just Z2.
Auto-Emails being delayed
Emails coming from BW process chains are delayed 2 hours on BW before being released.
Due to the user IDs that send these emails (e.g. ALEREMOTE) being registered in a different timezone (i.e. CET) from the BW system (i.e. GMT).
Pgm_Not_Found short dump
Whenever a query is run via RRMX or RSRT
Call transaction RS_PERS_ACTIVATE to Activate History and Personalisation
Characteristics not found
When running a query the warning message Characteristic does not exist is displayed for the following: 0TCAACTVT, 0TCAIPROV, 0TCAVALID
We activated the three characteristics listed and the warnings stopped. No need to make them authorization-relevant at this stage (we also did 0TCAKYFNM).
System generated Z pgms have disappeared
Post-upgrade, the system-generated Z programs ceased to exist.
Discovered in Development, so we compared with pre-upgrade Production and then recreated them or copied them from production.
Conversion issues with some Infoobjects
Data fails to Activate in the ODS targets
For the InfoObjects in question, set the flag so as not to convert the Internal values for these infoobjects
InfoObject has Conversion routine that fails, causing load to fail
The routine prefixes numeric PO numbers with 0s. SD loads were failing as it was not able to convert the numbers. Presumably the cause of the failure was running the pre-PREPARE program RSMDCNVEXIT.
Check the Tick box in the Update rule to do the conversion prior to loading rather than the other way round.
Requests fail to Activate on numeric data
Request loads OK (different from above issue) but fails to Activate
Forced conversion within the update rules using Alpha routine. Deleted Request and reloaded from PSA.
Database views missing after pre-PREPARE work
Views got deleted from database, although not from data dictionary
Recreated the views in the database using SE14.
Workbook role assignations lost
We lost a few thousand workbook assignments when we transported the role they were attached to into Production
The workbooks did not exist in Development, thus they all went AWOL. We wrote an ABAP program to re-assign them in production
Regression Testing Process
Introduction
We were limited to what we could do here. We didn't have a sandbox environment available. Nor did we have the opportunity to have a replica of our production data to test with, due to lack of disk space in Acceptance and lack of sufficient Basis resource.
Set up
We manually replicated our production process chains into test. We didn't have any legacy InfoPackages to worry about. We asked our super-users for a list of their "Top 10" most important queries and did a reverse transport of the queries from Production back into test (as we do not generally have a dev/acc/prodn process for managing queries, and they are mostly created solely in prodn). We made sure every application was represented. In retrospect we should have done some Workbooks as well, although that didn't give us any problems.
Actions
Prior to the various changes we loaded data via the process chains and ran the example queries to give ourselves a baseline of data to test against. After the change we ran the same queries again and compared the results against the baseline. We tried to keep the R/3 test environments as static as possible during this, although that wasn't always the case and we often had to explain away small changes in the results. After upgrading BW Development we connected it to Acceptance R/3, so that we had pre-upgrade (BW Acceptance) and post-upgrade (BW Development) systems both taking data from the same place, and could compare and contrast results on both BW systems. We did the same thing once BW Acceptance had been upgraded, by connecting it (carefully!) to Production R/3. To get round the lack of disk space we tested by application and deleted the data once that application had been signed off. Once we got to system test we involved super-users to sign off some of the testing.
Security
We chose to implement the new security schema rather than choosing the option to stick with old. For us, with a relatively small number of users we felt we could get away with this & if it all went wrong, just temporarily give users a higher level role than they needed. Our security roles are not complex: we have end user, power-user and InfoProvider roles for each BW application, together with some common default roles for all. In the event we simply modified the default "Reports" role that all our users are assigned, transported it and it all went smoothly. Apart from the fact that everyone's workbooks are assigned to this role and so we "lost" them all !
Transport Freeze
Once you've upgraded Development you've lost your transport path. We planned around this as best we could and when absolutely necessary, developed directly in Acceptance or Production, applying those changes back to Development once the project was complete. Depending on what other BW projects you have running this may or may not cause you pain!
Web Applications
Dashboards
We had various dashboards designed via Web Application Designer. All these continued to function on the upgraded system. However there were various formatting changes that occurred e.g. Bar graphs were changed to line graphs, text formats on axes changed etc. SAP provides an upgrade path for moving your Web applications by running various functions. However we took the view that we would simply re-format our dashboards manually, as we didn't have very many to do. Plus the external IGS (see below) powered all our environments and needs upgrading separately as part of the SAP method. Thus we couldn't have tested the SAP path in Development without risking Production. Sticking with manual mods was a lower risk approach for us. We did find we had to re-activate some templates from BC to get some of the reports to continue to work.
Internet Graphics Server (IGS)
We had an external IGS server with v3.0B. Post-upgrade, the IGS becomes part of the internal architecture of BW and the external server is redundant. We found no issues with this; after the upgrade BW simply stops using the external IGS and no separate configuration was needed.
Detailed Task Lists
Support Pack Hike (detail)
Apply OSS 780710 & 485741
Communicate outage to users
Warning msg on screen
Stop the jobs which extract data into delta queues
Clear the delta queues by loading into BW
Check RSA7 in PAE that delta queues are now empty
Run Baseline for Regression tests
Stop Delta queues
Lock Out users
Full Backup
Apply SP's
Upgrade the SAP kernel from 620 to 640
SPAU list review
Apply OSS Notes 768007 & 861890
Unlock Test users (inc. RFC id's on R/3)
Regression testing
Regression sign-off
Remove warning msg
Unlock users
User communication
Plug-in installation (detail)
Communicate outage to users
Warning msg on screen
Apply SAP note 684844
Lock out users
Full Backup
Empty CRM queues
Import and patch Basis plugin PI 2004.1
SPAU list review
Apply OSS 853130
Switch back on flag in TBE11 (app. BC-MID)
Remove warning msg
Unlock users
User communication
Dbase upgrades
Dbase upgrades (detail)
Communicate outage to users
Warning msg on screen
Run Baseline for Regression tests
Stop the Data extract jobs
Full Backup
Lock Out users
Apply FixPak13SAP to DB2 database
Upgrade the SAP kernel from 620 to 640
Apply OSS 725746 - prevents RSRV short dump
Unlock Test users (inc. RFC id's on R/3)
Regression testing
Regression sign-off
Remove warning msg
Unlock users
User communication
PREPARE Process (detail)
Pre-PREPARE Process
RSDG_ODSO_ACTIVATE
Repair Info objects and recreate the views
Communicate outage to users
Warning msg on screen
Run Baseline for Regression tests
Stop the Data extract jobs
Lock Out users
Full Backup
Run RSMDCNVEXIT Using BSSUPPORT ID
If the conversion process runs long, delay the regular backup
Re-run Baselines and sign off
If the conversion process fails, repeat the steps on Sunday after the regular backup
BW work
Back up customer-specific entries in EDIFCT (note 865142)
Activate all ODS objects
Execute report RSUPGRCHECK with flag "ODS objects" (note 861890)
Check Inconsistent InfoObjects - Upgr Guide 4.4; OSS 46272; Convert Data Classes of InfoCubes - Upgr Guide 4.3
Execute report SAP_FACTVIEWS_RECREATE (note 563201)
Make sure Delta queues are empty (RSA7)
Basis Work
Full Backup of BW
Lock users
Unlock id's RFC id's and designated test users
Confirm backup complete OK
Apply OSS 447341 Convert Inconsistent Characteristic Values - Upgr Guide 4.5
Confirm OK to PREPARE
Run PREPARE
Review of results
System Upgrade (detail)
Communicate outage to users
Process errors from PREPARE
Check disk space availability for PAB
Warning msg on screen
Reverse Transport Queries from PAB to SAB & DAB
Change BW to R/3 connection
Get backup put on hold, for Ops to release later
Ensure Saturday night backup is cancelled
Final run of PREPARE
Run Baseline for Regression tests
Clear Delta Queues
Delete Local Transports
Remove process chains from schedule
Lock Out users (with some exceptions)
Confirm to Ops and Angie that we're ready
Full Backup
Incremental backup of Unix files
UPGRADE "START"
Back up kernel
Unpack latest 700 kernel to upgrade directory
Check ids required for upgrade are unlocked
Check no outstanding updates
Turn off DB2 archiving
Open up the client for changes
Stop saposcol, & delete from exe directory
Run the Upgrade
Execute the saproot.sh script
Perform the database-specific actions
Perform follow-up activities for the SAP kernel
Reimport additional programs
Import Support Packages
Call transaction SGEN to generate ABAP loads
Transport Management System (TMS)
Handover to BW Team
SPAU list review
Apply OSS Notes from SPAU
Process Function Module XXL_FULL_API (part of SPAU)
Restore table EDIFCT (if required)
Transport Fixes from our BW Issue list
Convert the chart settings - manually
Perform activities in the Authorization Area
Activate hierarchy versions
Update the where-used list
Execute the conversion program for the product master
Transport New roles to PAB
Unlock selected users
Regression testing
Regression sign-off
Go / No Go decision
Restore if required
Remove warning msg
Tell Ops that the CR is now complete
Lock down PAB
Unlock users
User communication
Perform the follow-up activities for SAP Solution Manager
Reschedule background jobs
Reschedule weekly backup on Ctrl-M
Drinks all round.
Hope this helps. -
Error while deploying a PAR file from NWDS into an ECC.
Hi all,
I am getting this error while deploying a PAR file from NWDS into an ECC system.
Operation Failed: Please make sure the server is running or check the log (sap-plugin.log) for
more detail.
My server is running properly.
1 - Where is the sap-plugin.log file? I can't find it.
2 - Could there be another file, with another name, containing information about the error?
3 - Is there another way to deploy the file directly on the ECC?
Regards,
Hi,
Just make sure you have maintained the correct server settings. To check them, open the NWDS and follow this path:
Windows / Preferences / SAP Enterprise Portal
Check the following entries:
Alias
Host
Port
Login etc.
Regards, -
Data Migration from Legacy system to ECC systems via ETL through SAP PI
Hi All,
I wanted to know if we can migrate data from legacy systems to ECC systems via PI.
What I understand is that an ETL tool is used to extract the data from the legacy system, and the client is looking to load that data via PI.
Can we do that? I am concerned because this will involve mass data.
If I have to use PI, I see the options of PROXY / IDoc / FILE as the receiver in the ECC system (taking the data from the ETL tool).
If somebody has done this before, please share the approach.
Thanks,
Pushkar Patel
Hi,
I require a few details from you.
1. What ETL tool are you using? If Informatica, it already has PowerConnect to connect to SAP, so you can create source and target structures and also use RFCs to send data to R/3. For other ETL tools, check whether you can use RFCs or another way to send data to R/3. Let me know the tool.
2. Does R/3 contain the master data tables? If yes, try using LSMW for the mass upload of data to the tables.
If your client doesn't want to use either of these options, please elaborate on the case.
Regards
Aashish Sinha -
Interview questions on upgrades from 4.7 to ECC
Hi,
Can anyone please post some potential areas/questions that will be asked for upgrade projects from 4.7 to ECC 6.0?
Points will be awarded.
Thanks
Hi Frank,
This is a good question, because SAP upgrades from 4.6B to ECC 6.0 are very risky.
Take one example: in 4.5B the ME21N application has screen 620, whereas in ECC 6.0 it has screen 9000; we take care of that during the upgrade.
Check the obsolete function modules: 4.5B has WS_UPLOAD, WS_DOWNLOAD, etc. Use CL_GUI_FRONTEND_SERVICES instead, which provides different methods for these.
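The point about obsolete function modules can be sketched as follows (the file name and table type are illustrative):

```abap
DATA lt_lines TYPE TABLE OF string.

* Class-based replacement for the obsolete WS_UPLOAD function module.
cl_gui_frontend_services=>gui_upload(
  EXPORTING
    filename        = 'C:\temp\input.txt'   " illustrative path
    filetype        = 'ASC'
  CHANGING
    data_tab        = lt_lines
  EXCEPTIONS
    file_open_error = 1
    OTHERS          = 2 ).
IF sy-subrc <> 0.
  MESSAGE 'Upload failed' TYPE 'E'.
ENDIF.
```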
Once you change the code in ECC, go to transaction SLIN to check exactly where the errors are. -
Is it possible to open a query from sap menu favorite to excel?
Hi all
I have added a query to my favorites. From the favorites in the SAP menu, I want to open the query in Excel.
Is it possible?
Currently the query opens in the web.
Please revert if you have any ideas.
Thanks
ajay
Hi Daya Sagar,
You said that a query saved to favorites from Query Designer will open in the browser and one saved from the Analyzer will open in BEx Excel, but it does not work for me. Both save with a web icon and both queries open in the browser.
Hi All,
The different story for me is that I have a problem with the web browser opening when running a query from favorites or the user menu, because the URL contains a different client. Is there any way to fix this URL problem? Anyone in this thread who can help me with this issue will be awarded. Thanks.
Raj -
See sql query from crystal report without crystal report
Hi,
It depends on the datasource type, but you could have a look at an ODBC trace, or, if you have access to the SQL Server, you could use Profiler to monitor the session.
Regards,
Craig
And this will only be of use if you know which server/instance/database the report is connecting to...
Please click "Mark As Answer" if my post helped. Tony C. -
How to call a BW Query from an ABAP program?
Hi,
Check these links:
/people/durairaj.athavanraja/blog/2005/04/03/execute-bw-query-using-abap-part-i
/people/durairaj.athavanraja/blog/2005/04/03/execute-bw-query-using-abap-part-ii
/people/durairaj.athavanraja/blog/2005/12/05/execute-bw-query-using-abap-part-iii
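For reference, the approach described in those blog posts is built around the RFC-enabled function module RRW3_GET_QUERY_VIEW_DATA. A minimal hedged sketch follows; the InfoProvider and query names are invented, and the exact parameter and type names should be verified in SE37 on your release:

```abap
DATA: lt_cell_data TYPE rrws_t_cell,
      lt_axis_data TYPE rrws_thx_axis_data.

* Execute a BW query and retrieve its result cells and axes.
CALL FUNCTION 'RRW3_GET_QUERY_VIEW_DATA'
  EXPORTING
    i_infoprovider = 'ZMYPROV'     " illustrative InfoProvider
    i_query        = 'ZMYQUERY'    " illustrative query technical name
  IMPORTING
    e_cell_data    = lt_cell_data
    e_axis_data    = lt_axis_data
  EXCEPTIONS
    OTHERS         = 1.
IF sy-subrc <> 0.
  MESSAGE 'Query execution failed' TYPE 'E'.
ENDIF.
```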
Hope this helps.
Cheers -
RC 8 error while transporting the query from Dev to Qa - Element missing
Hi All,
I have made a MultiProvider by copying an existing one.
I have copied one of the BEx queries using RSZC from the source to the target MultiProvider.
While transporting the query from Dev to QA I got the following error. (I had already moved the MultiProvider in a previous TR.)
Has anyone seen similar messages before? Any input will be highly appreciated.
Start of the after-import method RS_ELEM_AFTER_IMPORT for object type(s) ELEM (Activation Mode)
Error when activating element 4LUSZK561QN74UDBJ7BPU9GFM
Element 4LUSZJXHIS1HM7TVDD9DK7HPU is missing in version M
End of after import methode RS_ELEM_AFTER_IMPORT (Activation Mode) - runtime: 00:00:06
Start of the after-import method RS_ELEM_AFTER_IMPORT for object type(s) ELEM (Delete Mode)
End of after import methode RS_ELEM_AFTER_IMPORT (Delete Mode) - runtime: 00:00:14
Errors occurred during post-handling RS_AFTER_IMPORT for ELEM L
RS_AFTER_IMPORT belongs to package RS
The errors affect the following components:
BW-WHM (Warehouse Management)
Post-import method RS_AFTER_IMPORT completed for ELEM L, date and time: 20110517070045
Post-import methods of change/transport request BDAK959603 completed
Start of subsequent processing ... 20110517070025
End of subsequent processing... 20110517070045
Kind regards,
Hi Jith,
First of all, try to consolidate all your objects in a single TR and then move it from D to Q.
In this case, check the list of objects in your TRs. If TR3 has all the objects that were in TR1 and TR2, then transporting TR3 alone will work.
Also, you can find information about the elements missing from TR1/TR2 as follows:
1. Go to your transport logs and then to the entries marked as errors.
2. There you will find the query element IDs; copy them.
3. Now go to table RSZELTDIR (SE16) and enter those query element IDs.
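Step 3 can also be done programmatically. A hedged sketch (the element ID is the one from the log above; verify the RSZELTDIR field names in SE11):

```abap
* Look up a query element reported as missing in the transport log.
* RSZELTDIR is the directory of query (reporting) elements.
DATA lt_elem TYPE STANDARD TABLE OF rszeltdir.

SELECT * FROM rszeltdir INTO TABLE lt_elem
  WHERE eltuid = '4LUSZJXHIS1HM7TVDD9DK7HPU'.

IF lt_elem IS INITIAL.
  WRITE: / 'Element not found - it must be included in the transport.'.
ENDIF.
```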
Now you should be able to find the elements that were missing from your TRs. Hope this helps. -
0ADHOC error while executing the query from Report Designer
Hi All,
When I execute the query from Report Designer, I get the error below.
Please help me in this regard.
The initial exception that caused the request to fail was:
The Web template "0ADHOC" does not exist in the master system
com.sap.ip.bi.base.exception.BIBaseRuntimeException: The Web template "0ADHOC" does not exist in the master system
at com.sap.ip.bi.webapplications.runtime.service.template.impl.TemplateService.getTemplateContent(TemplateService.java:57)
at com.sap.ip.bi.webapplications.runtime.jsp.portal.service.template.PortalTemplateAccessService.getTemplateContent(PortalTemplateAccessService.java:82)
at com.sap.ip.bi.webapplications.runtime.preprocessor.Preprocessor.parseTemplate(Preprocessor.java:163)
at com.sap.ip.bi.webapplications.runtime.xml.XmlTemplateAssembler.doInit(XmlTemplateAssembler.java:79)
at com.sap.ip.bi.webapplications.runtime.template.TemplateAssembler.init(TemplateAssembler.java:133)
Thanks,
KVR
Hi,
The web template 0ADHOC is usually the standard web template for 3.x queries. The Report Designer executes reports in the Java web runtime, so it tries to find the template 0ADHOC among the 7.0 templates and does not find it. Check whether you have maintained the standard web template 0ADHOC for 7.0 reports as well:
Transaction SPRO -> SAP Reference IMG -> SAP NetWeaver -> Business Intelligence -> Settings for Reporting and Analysis -> BEx Web -> Set Standard Web Templates
Best regards,
Janine