BW upgrade to EHP1: should data uploads stop?

Dear experts,
We are planning a system upgrade. The current system is BW 7.0 SP 17, and the target is EHP1 with SP 9.
I know there are some post-upgrade activities, which include consistency checks for objects (InfoObjects, transfer rules, cubes, DSOs, etc.).
Could someone please confirm: do we need to stop data uploads and process chains during the system upgrade?
Thanks in advance!
Best Regards,
Mannu

Hi Ingo,
RSRT was giving correct results. We have now implemented a few SAP Notes and the issues are resolved.
The following are the notes:
1499233 - MDX:bXML flattening, unbalanced hierarchy, empty columns
1485648 - MDX: bXML flattening and hierarchies and displaced columns
1446245 - MDX: Error when RSR_MDX_BXML_GET_GZIP_DATA is called
1441767 - MDX: No data for bXML if only ALL member is requested
1438091 - MDX: basXML: Object MEASURE 0 not found
1435844 - MDX:Wrong no. decimal places for basXML flattening interface
1432162 - MDX: Flattening problems when using hierarchies
1420169 - MDX: bXML flattening: Subsequent note
1411491 - MDX: bXML flattening in transac. MDXTEST: Selecting packages
1404328 - MDX: bXML flattening and PROPERTIES: Columns overwritten
Thanks for your inputs.
Regards,
shesha.

Similar Messages

  • What master data should be uploaded for go-live

    Can anyone tell me which SD master data should be uploaded during go-live?

    Hi Prem,
    SD Master data includes:
    - Customer Master data
    - Material Master data
    - Customer-Material Info records
    - Enterprise Structure info (Distribution Channels, Divisions, etc.)
    - Condition Master Data in Pricing
    I believe all of these need to be uploaded during go-live.
    regards,
    Raj

  • Data Upload through LSMW

    Hi All
    I want to upload vendor transactions through T-code F-43, with tax items.
    Could you help me on the following points:
    1. Which fields to include in the structure
    2. How to handle WHT and tax, as these are deducted automatically during posting
    3. A sample sheet for the data upload
    Thanks in advance
    SAPUSER5

    Hi SAPUSER5,
    Please clarify whether your question relates to cutover activities or day-to-day activities.
    If it is cutover:
    > During cutover, you should not upload with TDS tax codes active in the vendor master, because TDS would then be posted automatically when you post to a vendor account. There is a workaround for stopping the TDS posting; do let us know your requirement.
    > Routine: you don't need to enter TDS codes in the upload sheet; they are calculated automatically when you post to a vendor account.
    Do let me know if you have more questions on this.
    Regards,
    SAPFICO

  • Best data upload protocol.

    HI Everyone,
    We maintain a JSP-based application in which we upload data over HTTP. Daily data uploads run to gigabytes. The problems we are facing are:
    Memory heap size errors.
    Request timeout errors on our web server.
    No resume support in case of network errors.
    Now we want to upgrade our application to get the following benefits:
    Get rid of the problems mentioned above.
    Reduce data upload time.
    Resume support in case of network errors.
    I have the following questions in this regard:
    What should the data upload protocol be: HTTP or FTP?
    What should the application interface for the upload module be: a Java applet, JSP, a Java desktop application, etc.?
    To work around the memory heap errors we use a cron job to restart the web server; is this OK, or is there a better solution?
    Your valuable advice is needed to make the correct decision.
    Regards.

    There is no "best". What CAN you use? If FTP is an option (and make it SFTP), then of course go for that; then you don't have to be bothered by the clunky HTTP file upload.
    > What should the application interface for the upload module be?
    How about a proper existing SFTP client? There is absolutely no reason why that should be anything Java-related.
    > To work around the memory heap errors we use a cron job to restart the web server; is this OK, or is there a better solution?
    How about finding and fixing the source of your memory leaks?
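    To make the SFTP suggestion concrete, here is a minimal sketch of a resumable upload in Python using the third-party paramiko library. The host, credentials, and paths are placeholders; the resume logic simply continues from the size of any partial file already on the server.

    import paramiko

    CHUNK = 1024 * 1024  # transfer in 1 MiB chunks

    def resumable_upload(host, user, password, local_path, remote_path):
        transport = paramiko.Transport((host, 22))
        transport.connect(username=user, password=password)
        sftp = paramiko.SFTPClient.from_transport(transport)
        try:
            try:
                offset = sftp.stat(remote_path).st_size  # partial upload already there?
            except IOError:
                offset = 0                               # nothing uploaded yet
            with open(local_path, "rb") as src, sftp.open(remote_path, "ab") as dst:
                src.seek(offset)                         # skip what the server already has
                while True:
                    chunk = src.read(CHUNK)
                    if not chunk:
                        break
                    dst.write(chunk)
        finally:
            sftp.close()
            transport.close()

    # Example call (placeholder values):
    # resumable_upload("upload.example.com", "batchuser", "secret",
    #                  "daily_extract.csv", "/incoming/daily_extract.csv")

    Streaming in fixed-size chunks keeps memory use flat regardless of file size, which addresses the heap problem directly instead of papering over it with server restarts.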

  • Database Upgrade in Data Guard

    Hi all,
    I have an active Data Guard configuration running, and I want to upgrade my database. What should I do? Should I stop Data Guard, shut down both the primary and the standby database, upgrade the primary first and then the standby, and then rebuild Data Guard?
    Or is there a better recommendation?

    Hi CKPT,
    Can you help me find the best method to upgrade from 10.2.0.4 to 11.2.0.3 with a physical standby database? My plan is:
    1. Stop redo apply
    2. Stop all standby services
    3. Upgrade the standby
    4. Switch over to the upgraded standby
    5. Upgrade the old primary if there were no errors in step 4
    6. Switch back
    Please suggest the best method with minimal downtime, and note that I have more than 4 TB of data. Thanks.

  • Function module Vs BDC for master data upload

    Hi,
    Please advise whether we should use the following function modules for master data upload, or whether we should go for BDC.
    MP_RFC_SINGLE_CREATE
    MP_RFC_INACT_CHANGE
    MPLAN_CREATE
    MPLAN_CHANGE
    MPLAN_SET_DELETION_INDICATOR
    ASSET_MASTERRECORD_MAINTENANCE
    MPLAN_ITEM_CREATE
    MPLAN_ITEM_CHANGE
    GL_ACCT_MASTER_SAVE
    We have already used these function modules in our upload program, but we are not sure whether they will create any data inconsistencies.
    Please let me know whether we should continue using these FMs, or whether there is a risk in using them and we should replace them with BDC.
    Thanks in advance.

    Hi Vikram,
    It is better to search for a BAPI for uploading the master data, because there are problems with both BDC and plain FMs.
    If you use FMs, they may not contain all the fields you need. If you go for BDC, it is not maintainable across future releases: after an upgrade, the screens may change.
    If no BAPI is available, then go for BDC.
    Thanks
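    To make the BAPI suggestion concrete, here is a minimal sketch of the generic call-and-commit pattern, shown from Python via the pyrfc wrapper for the SAP NetWeaver RFC SDK. The connection values and the name BAPI_EXAMPLE_CREATE are placeholders - look up the actual BAPI for your master data object in the BAPI Explorer; only BAPI_TRANSACTION_COMMIT and BAPI_TRANSACTION_ROLLBACK are standard.

    from pyrfc import Connection

    # Placeholder connection parameters.
    conn = Connection(ashost="sapserver", sysnr="00", client="100",
                      user="RFC_USER", passwd="secret")

    # Call the object's create BAPI (hypothetical name and parameters).
    result = conn.call("BAPI_EXAMPLE_CREATE", HEADERDATA={"FIELD": "VALUE"})

    # Inspect the RETURN messages; BAPIs never commit on their own.
    messages = result.get("RETURN", [])
    if isinstance(messages, dict):   # some BAPIs return a single structure
        messages = [messages]
    if any(m.get("TYPE") in ("E", "A") for m in messages):
        conn.call("BAPI_TRANSACTION_ROLLBACK")
    else:
        conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")

    The explicit commit/rollback is the main reason BAPIs are safer than calling internal FMs directly: nothing is persisted until the RETURN table has been checked.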

  • I upgraded some things and QuickTime stopped playing MPEG-2 movies (black screen)!

    I need help; I was finishing a project and now it just plays black!
    I have QuickTime X.
    HELP! PLEASE!
    thank you.

    QuickTime was working fine; I am not sure which QuickTime version it was. I upgraded Java, and I am not sure whether I upgraded QuickTime. After that, all QuickTime movies with "MPEG-2 Video, Linear PCM, Timecode" codecs started playing black with the audio in the background. iMovie was not used. Thank you; I am beyond desperate and starting to think no one can help me with this.
    Then I am not sure how you created the MPEG-2/PCM MOV files. Files not imported via the MPEG-2 import codec would normally remain in their long-GOP compression format, which is not edit-compatible in QT-based video editors. Files that are imported via the codec to make them edit-compatible then need that codec for proper playback in QT-based players/editors.
    Since you now say that "QuickTime is working fine," am I to assume that it is After Effects that is playing "black" videos and that the files play fine in the QT players? Since you also mention the files are timecoded, am I to assume you used FCP or some other app to import the files (which may use the same sort of import workflow), or that it is the timecode data itself that you feel is now causing the problem? In either case, the Java update may have been a security update that now causes your issue by disallowing access to the required playback component, for reasons Apple is unlikely to divulge if they concern a software security risk.
    If the problem is general on your system, do you have a short sample file that you can upload for testing on my Snow Leopard system? At the very least, this may confirm whether the problem is general to all updated platforms or local to your individual system and/or user account.

  • Legacy asset data upload for multiple line items

    Hello
    Legacy asset data upload for multiple line items: for example, Building is an asset that has different line items for the purchase of land, construction, renovation, etc. When uploading the legacy data, what should be considered: only one line item for the building, or a number of line items?
    Which is the proper way to do this exercise?
    regards

    Hi,
    It depends completely on the client's requirements, but here are a few approaches:
    1. Define Building as an asset class.
    2. Create a building at a specific location as an asset code, and
    3. Create asset sub-numbers for the other components.
    Another approach would be:
    1. Define Building as an asset class.
    2. Create a building at a specific location as an asset code, and
    3. Create further asset codes for the other components, using the description field to relate them to each other.
    Regards,
    Sayujya

  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1. Users would like to upload data from flat files and subsequently view their reports.
    2. The SAP BW support team would not be involved in the data upload process.
    3. Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
    4. Users consist of two groups, external and internal. External users would not have access to the SAP system; however, access via a portal is acceptable.
    What are the best practices we should adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We receive the files from the web onto the application server, into a path dedicated to this process. The file placed on the server has a fixed naming convention based on your project; you can choose the name. Every day a file with the same name, but different data, is placed on the server. The path in the InfoPackage is fixed to that location. The process chain then triggers and loads the data from that fixed path on the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from the path.
    This happens every day.
    Rgds
    SVU123
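    For illustration only, here is a minimal sketch of the post-load housekeeping step described above; all paths and the file name are placeholder assumptions for your own convention.

    import shutil
    from datetime import date
    from pathlib import Path

    # Placeholder paths: the fixed pickup path referenced by the InfoPackage,
    # and an archive folder for the dated backup copies.
    PICKUP = Path("/interfaces/bw/inbound/daily_upload.csv")
    ARCHIVE = Path("/interfaces/bw/archive")

    def archive_processed_file():
        if not PICKUP.exists():
            return                      # nothing delivered today
        ARCHIVE.mkdir(parents=True, exist_ok=True)
        backup = ARCHIVE / f"{PICKUP.stem}_{date.today():%Y%m%d}{PICKUP.suffix}"
        shutil.copy2(PICKUP, backup)    # keep a dated backup copy
        PICKUP.unlink()                 # clear the pickup path for tomorrow's file

    archive_processed_file()

    Scheduled after the process chain finishes, this keeps the pickup path clear so the next day's file with the same fixed name can land cleanly.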

  • G4 and Panther upgrade: could I? Should I?

    I purchased a G4 PowerBook in 2003 and never updated the operating system. Over the past year it has been running slower and slower, and now it cannot access many webpages and Safari quits often. So I have a few questions.
    Here's my specs (as they read on the side of the box it came in)
    1 GHz/512MB/60G/Combo/BT/APX
    Now, the first question: I ordered 512 MB of RAM when I purchased, but when I look at 'About This Mac' or the system specs, it says I only have 256 MB and the other slot is 'empty'. Which is correct? Could I have never received the proper RAM?
    Next question: how can I upgrade the operating system, and will this improve how the computer works? I have Mac OS X 10.3.9 now with Safari 1.3.2 (v312.6). And should I upgrade? Could it cause more problems than it solves? I'm not a tech expert, so I'm concerned I could do more harm than good.
    I mostly use the computer for internet/email, word processing and running Adobe CS1 (mostly photoshop).
    Ok, any help is greatly appreciated. Thank you.

    Welcome to Discussions, Otist
    "...has been running slower and slower..."
    How much free space is available on the 60 GB HD?
    Do you run routine maintenance?
    Insufficient available space can cause performance issues, system corruption, and possible loss of data.
    Depending on usage habits, the general recommendation is to keep 10% to as much as 20% of the total capacity available at all times.
    "...can not access many webpages and Safari quits often."
    The version of Safari that Panther supports is too outdated for newer Web technology.
    "...only have 256MB of RAM..."
    Possibly the extra stick is not being recognized.
    You could run the Apple Hardware Test or use the utility Rember to test it, but I'm not sure what that will tell you if indeed the second slot is empty.
    You may have to physically open the case to make that determination.
    I have very limited knowledge about RAM and hardware in general.
    Another member will augment my response.
    "...upgrade the operating system and will this improve computer function?"
    I believe so, if you clean up the present system and increase the RAM to at least 1 GB.
    You may also want to consider a larger hard drive.
    "...should I upgrade?"
    Absolutely!
    The PowerBook G4/ 1 GHz will support the Leopard System Requirements.
    The Full Retail Version, of the Leopard Install DVD, can be purchased at The Apple Store (U.S.), Apple retail stores, Apple resellers, and some Online vendors.
    If you know what to look for, a Full Retail Version, of the Leopard Install DVD, can be purchased, sometimes less expensively, at some online Apple retailers, Amazon, eBay, FastMac, HardCore Mac, etc.
    Be sure not to purchase grey, upgrade or machine specific CDs or DVDs.
    Leopard is on DVD.
    The disc should look exactly like the images in the above links, and not say Upgrade, CPU Drop-in DVD, or "This software is part of a hardware bundle purchase - not to be sold separately." on it.
    Additional info in these links.
    Using OS X Install CDs/DVDs On Multiple Macs
    What's A Computer Specific Mac OS X Release
    Software Update, Upgrade: What's The Difference?
    Caveat Emptor!
    If any are presently available, examine these items very carefully, and if in doubt, ask questions of the seller before purchase!
    Leopard On eBay
    Once Leopard 10.5.x is installed, you can use the Mac OS X 10.5.7 Combo Update.
    You may also find this Leopard Installation and Setup Guide PDF useful.
    The posted RAM System Requirement, is the bare minimum.
    For optimum performance, more is recommended.
    As a precaution, before upgrading the OS, you should create a reliable backup of the entire system, or at least, any data you do not wish to lose or corrupt.
    In addition to the Archive & Install procedure, there is also a Simple Upgrade, or an Erase & Install option.
    Review the info here About Installation Options.
    Specifically; "About Upgrade to Mac OS X
    Upgrading to Mac OS X takes a little longer than installing it on a volume without Mac OS X, but it is the least intrusive way to install--most of your existing settings and applications are left untouched during an upgrade. In other words, you won't have to configure a lot of settings afterwards."
    ali b

  • Database Upgrade using Data Pump

    Hi,
    I am moving my database from a Windows 2003 server to a Windows 2007 server. At the same time, I am upgrading this database from 10g to 11gR2 (11.2.0.3).
    I am therefore using the export/import method of upgrade (via Data Pump, not the old exp/imp).
    I have successfully exported my source database and have created the empty shell database, ready to take the import. However, I have a couple of queries.
    Q1. Regarding all the SYSTEM objects from the source database: how will they import, given that the new target database already has a SYSTEM tablespace?
    I am guessing I need to use the TABLE_EXISTS_ACTION option for the import. However, should I set this to APPEND, SKIP, REPLACE, or TRUNCATE - which is best?
    Q2. I am planning to slightly change the directory structure on the new database server - would it therefore be better to pre-create the tablespaces, or leave this to the import and use the REMAP_DATAFILE option? What is everyone's experience as to which is the better way to go? And if I pre-create the tablespaces, how do I tell the import to skip creating them?
    Q3. These two databases are on the same network, so in theory, instead of a manual export, a copy of the dump file to the new server, and then the import, I could use a network link for the import. I was just wondering whether there are any cons to this method compared with using an explicit export dump file?
    thanks,
    Jim

    Jim,
    > Q1. Regarding all the SYSTEM objects from the source database: how will they import, given that the new target database already has a SYSTEM tablespace? Should TABLE_EXISTS_ACTION be APPEND, SKIP, REPLACE, or TRUNCATE?
    If all you have is the base database and nothing created, then you can do full=y. In fact, this is probably what you want. The SYSTEM tablespace will already be there, so when Data Pump tries to create it, that one CREATE statement will simply fail; nothing else will fail. In most cases your system tables will already be there, and this is OK too. If you do schema-mode imports instead, you will miss out on some of the other objects.
    > Q2. Would it be better to pre-create the tablespaces, or leave this to the import and use the REMAP_DATAFILE option? And if I pre-create the tablespaces, how do I tell the import to skip creating them?
    If the directory structure is different (which it usually is), then there is no easier way. You can run impdp with SQLFILE and INCLUDE=TABLESPACE. This gives you all of the CREATE TABLESPACE commands in a text file, which you can edit to change whatever you want. You can then tell Data Pump to skip the tablespace creation by using EXCLUDE=TABLESPACE.
    > Q3. I could use a network link for the import. Are there any cons to this method compared with using an explicit export dump file?
    The only con could be a slow network. That will make it slower, but if you have to copy the dump file over the same network anyway, you will still see the same basic traffic. The pro is that you don't need the extra disk space. Here is how I look at it:
    1. You need XX GB for the source database.
    2. You need YY GB for the source dump file.
    3. You need YY GB for the target dump file that you copy.
    4. You need XX GB for the target database.
    By going over the network, you get rid of 2*YY GB for the dump files.
    Dean
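    For illustration, here is a sketch of the two impdp invocations Dean describes, driven from Python. The credentials, directory object, and dump file names are placeholders, while SQLFILE, INCLUDE/EXCLUDE=TABLESPACE, TABLE_EXISTS_ACTION, and NETWORK_LINK are standard Data Pump parameters.

    import subprocess

    # Step 1: extract the CREATE TABLESPACE DDL into an editable text file
    # without importing anything (SQLFILE performs DDL extraction only).
    subprocess.run([
        "impdp", "system/password",          # placeholder credentials
        "directory=DATA_PUMP_DIR",           # placeholder directory object
        "dumpfile=full_export.dmp",          # placeholder dump file
        "sqlfile=create_tablespaces.sql",
        "include=TABLESPACE",
    ], check=True)

    # (Edit create_tablespaces.sql for the new directory structure and run
    # it in the target database before the real import.)

    # Step 2: the full import, skipping tablespace creation since the DDL
    # was already run. For a network import, replace dumpfile=... with
    # network_link=<db link to the source database>.
    subprocess.run([
        "impdp", "system/password",
        "directory=DATA_PUMP_DIR",
        "dumpfile=full_export.dmp",
        "full=y",
        "exclude=TABLESPACE",
        "table_exists_action=skip",
    ], check=True)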

  • FI Master data upload sequence

    Experts,
    Please excuse me if my post sounds vague. I am a Technical Consultant working off-site independently. I couldn't think of a better forum to get functional advice.
    Please look at the list of FI master data upload objects below, and suggest:
    a. the recommended sequence in which the objects should be uploaded
    b. points to consider / tips for uploading / recommendations (LSMW, BUS*, BAPIs, etc.)
    FI Objects:
    1. Cost Centers
    2. Profit Centers
    3. GL Accounts- COA,CC
    4. Fixed Assets
    5. Bank Data
    6. Cost Elements
    7. Activity Groups
    8. Activity Types
    9. Activity Prices
    Thanks in advance.
    NW

    Hi,
    First upload the master data, such as G/L accounts and primary and secondary cost elements; then upload the open items, then the P&L balances, then the balance sheet balances, and finally the asset masters with balances.
    Pre-Check LSMW: To-Do-List Before Using the Tool
    The LSM Workbench is a tool that supports the transfer of data from non-SAP systems to R/3. Basic functionality of the LSM Workbench:
    1. Import old data from spreadsheet tables or sequential files.
    2. Convert the data from source format into target format.
    3. Import the data into R/3.
    Before using the LSM Workbench, you need a concept for data migration. In particular, note the following items:
    · Make sure that R/3 customizing is complete.
    · Analyze the data existing in the legacy system to determine which data will be needed in the future (also from a business-operational point of view).
    · Consider whether usage of the tool makes sense with regard to the data volumes to be transferred. In case of very small data quantities, it may be easier to carry out the transfer manually. With very large data volumes, however, batch input technology may lead to excessively long runtimes. Rough estimate for the required time: 10000 records per hour; this value, however, may vary strongly depending on the hardware.
    · Identify the transaction(s) in R/3 you want to use for bringing the data into the SAP System. Here, it may also be relevant whether the data is required for statistical (evaluation) purposes or for further processing in the system.
    · Test the relevant transaction in R/3 manually with test data from the old system and make sure that all required fields are filled. There may be required fields that do not correspond to any data window in the legacy system. In such a case, assigning a fixed value or defining the field as optional for data transfer may be appropriate.
    · Check the interfaces provided by the application. Is there a batch input program and an IDoc (for example)? Which method should be used in your project?
    · Develop a mapping plan in written form: Assign the legacy system fields to the R/3 fields.
    · Determine the form (e.g. via "MOVE" or assignment according to a rule) in which the legacy system data shall be transferred to the SAP System.
    · If applicable, define the allocation rules (LSMW-internal name: "translation rules").
    · Define the way for extracting the data from the legacy system. (Note: LSMW does not extract data.)
    · Describe the form in which the old data are available: Will the host or the spreadsheet interface of the LSMW have to be used?
    · If only a part of your legacy system will be replaced by R/3, define the functionalities to be provided by the SAP System and those to be provided by the legacy system. If required, create a concept of the data flows and interface architecture.
    These questions can only be answered individually for every customer; as a matter of course, this should be done before the tool is used!
    SKS

  • HR master data upload

    I am trying to upload HR master data through BDC, but for infotype 0008 (Basic Pay) it shows all the values except the annual salary, and I get an error like: "internal error occurred while calling the function module RP_ANSAL_FROM_WAGETYPES".

    Hi,
    The structure of the data you pass and the wage type data you upload should be the same.
    You will also get this error if the wage types have not been configured.
    Please check the wage types, and check the flat file structure against P0008.
    Regards,
    N.L.

  • How to undo data upload in BPC

    Hi All,
    I have inadvertently uploaded an "Actual" data file into BPC with the wrong year: the Actuals data file should have 2008 as the year, but instead it has 2009.
    Please suggest the best way to fix this.
    Is there any way to delete transaction data in BPC?
    I also tried the option of using a journal to fix the issue, but every time I try to open an existing or new journal, I get a message that I do not have authorization for this module.
    I am working in a sandbox/demo environment with only one user, and that user is the Primary Admin/System Admin and has all accesses.
    Can anyone shed some light on why I might be getting this error message?
    Thanks in advance.
    Best Regards.

    Hi Anurag,
    To delete the existing data, go to Data Manager, choose Run Data Manager Package, and select the Clear package.
    Delete your existing data based on the parameters.
    If you do not have authorization for journal posting, check the task profile in your security settings.
    Also ensure the journal wizard has been created before working with journals.
    For your current issue, however, journal posting is not the correct option.
    Regards
    Naveen.KV

  • Reg: Efficient solution for a data upload scenario

    Hi All,
    I have the following task.
    Data is required from a legacy system (which generates data only as flat files) into SAP R/3 as FB01 journals, and an output file should be generated periodically (daily, weekly, fortnightly, etc.).
    Solution Approaches:
    1) Write a BDC program to post the data.
    2) Write an ABAP program to populate an IDoc (if a standard IDoc is available) or generate an outbound proxy (if a standard IDoc is not available) to push the data into SAP XI.
    Could anyone tell me which would be the best and most efficient approach for this task? I need your recommendations.
    Thanks in Advance.
    B.Lavanya

    Hi Lavanya,
    > Required data from a legacy system (flat files only) into SAP R/3 as FB01 journals - use BDC for this, because it handles large source files better.
    > The output file should be generated periodically (daily, weekly, fortnightly, etc.) - if this output file contains an acknowledgment for the data uploaded by the above process, create an ABAP report for it and schedule it. But if the output contains other IDoc data that you need to send as a file to a third-party system, go for SAP XI, provided the IDoc data is not too large. If the IDoc volume is huge, just create an ABAP report that writes the data to a file on the application server and FTP the file to the third-party system.
    Regards,
    Rajeev Gupta
