Transporting a modified data target to production which already has data?

Hi,
I have a cube to which I added two more fields, changing the data model accordingly.
Now I have a question:
In production this cube is loaded daily, and it already holds data records in the thousands.
What happens if I transport this data model without deleting the data in production?
Though we are on BI 7.0, I am using only the 3.x DataSources.
Raj

Hi Raj,
What fields did you add?
Characteristics --> The cube must be empty before transport; you will have to reload it.
Navigational attributes --> No problem. You can even use them in your queries; they will be "filled".
Key figures --> No problem, but no historical data: these fields will only be filled from the moment you transport the update rules.
Success,
Udo

Similar Messages

  • Hi, I have inserted a DVD in the drive which has data on it, but it is showing blank and asking to burn

    Hi, I have inserted a DVD in the drive which has data on it, but it is showing blank and asking to burn. It also shows the capacity of the DVD. The drive is ready to write to the disc but will not explore the data on it. Please help.

    See if this Fix it tool can repair it:
    http://support.microsoft.com/mats/cd_dvd_drive_problems/en-us

  • Error when executing an interface which loads data from a CSV file which has 320 columns

    Hi,
    Can someone provide a resolution for the error below?
    I have created an interface which loads data from a CSV file with 320 columns into a synonym that also has 320 columns,
    using LKM File to SQL and IKM SQL Control Append.
    I am getting the error below when executing the interface:
    com.sunopsis.tools.core.exception.SnpsSimpleMessageException: ODI-17517: Error during task interpretation. Task: 6 java.lang.Exception: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location> BSF info: Create external table at line: 0 column: columnNo
         at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:485)
         at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:711)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)
    Caused by: org.apache.bsf.BSFException: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location>
    BSF info: Create external table at line: 0 column: columnNo
         at bsh.util.BeanShellBSFEngine.eval(Unknown Source)
         at bsh.util.BeanShellBSFEngine.exec(Unknown Source)
         at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:471)
         ... 11 more
    Text: The application script threw an exception: java.lang.StringIndexOutOfBoundsException: String index out of range: 2 BSF info: Create external table at line: 0 column: columnNo
    out.print("createTblCmd = r\"\"\"\ncreate table ") ;
    out.print(odiRef.getTable("L", "COLL_NAME", "W")) ;
    out.print("<?=(extTabColFormat.getUseView())?\"_ET\":\"\"?>\n(\n\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
              "<?=extTabColFormat.getExtTabDataType(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022[DEST_WRI_DT]\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
         , ",\\n\\t", "","")) ;
    out.print("\n)\nORGANIZATION EXTERNAL\n(\n\tTYPE ORACLE_LOADER\n\tDEFAULT DIRECTORY dat_dir\n\tACCESS PARAMETERS\n\t(\n\t\tRECORDS DELIMITED BY 0x'") ;
    out.print(odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")) ;
    out.print("'\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_CHARACTERSET")) ;
    out.print("\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_STRING_SIZE")) ;
    out.print("\n\t\tBADFILE\t\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.bad'\n\t\tLOGFILE\t\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.log'\n\t\tDISCARDFILE\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.dsc'\n\t\tSKIP \t\t") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")) ;
    out.print("\n") ;
    if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {out.print("\n\t\tFIELDS\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
    out.print("\n\t\t(\n\t\t\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\tPOSITION([FILE_POS]:[FILE_END_POS])\\t"+
                        "<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
                        , ",\\n\\t\\t\\t", "","")) ;
    out.print("\t\t\n\t\t)\n\t)\n") ;
    } else {out.print("\n\t\tFIELDS TERMINATED BY x'") ;
    out.print(odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")) ;
    out.print("'\n\t\t") ;
    if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){out.print("\n\t\t") ;
    } else {out.print("OPTIONALLY ENCLOSED BY '") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)) ;
    out.print("' AND '") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)) ;
    out.print("' ") ;
    }out.print("\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
    out.print("\n\t\t(\n\t\t\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
                        "<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
                        , ",\\n\\t\\t\\t", "","")) ;
    out.print("\t\t\n\t\t)\n\t)\n") ;
    }out.print("\tLOCATION (") ;
    out.print(odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")) ;
    out.print(")\n)\n") ;
    out.print(odiRef.getUserExit("EXT_PARALLEL")) ;
    out.print("\nREJECT LIMIT ") ;
    out.print(odiRef.getUserExit("EXT_REJECT_LIMIT")) ;
    out.print("\n\"\"\"\n \n# Create the statement\nmyStmt = myCon.createStatement()\n \n# Execute the trigger creation\nmyStmt.execute(createTblCmd)\n \nmyStmt.close()\nmyStmt = None\n \n# Commit, just in case\nmyCon.commit()") ;
    ****** ORIGINAL TEXT ******
    createTblCmd = r"""
    create table <%=odiRef.getTable("L", "COLL_NAME", "W")%><?=(extTabColFormat.getUseView())?"_ET":""?>
         <%=odiRef.getColList("", "[CX_COL_NAME]\t"+
              "<?=extTabColFormat.getExtTabDataType(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022[DEST_WRI_DT]\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
         , ",\n\t", "","")%>
    ORGANIZATION EXTERNAL
         TYPE ORACLE_LOADER
         DEFAULT DIRECTORY dat_dir
         ACCESS PARAMETERS
              RECORDS DELIMITED BY 0x'<%=odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")%>'
              <%=odiRef.getUserExit("EXT_CHARACTERSET")%>
              <%=odiRef.getUserExit("EXT_STRING_SIZE")%>
              BADFILE          '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.bad'
              LOGFILE          '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.log'
              DISCARDFILE     '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.dsc'
              SKIP           <%=odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")%>
    <% if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {%>
              FIELDS
              <%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
                   <%=odiRef.getColList("", "[CX_COL_NAME]\tPOSITION([FILE_POS]:[FILE_END_POS])\t"+
                        "<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
                        , ",\n\t\t\t", "","")%>          
    <%} else {%>
              FIELDS TERMINATED BY x'<%=odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")%>'
              <% if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){%>
              <%} else {%>OPTIONALLY ENCLOSED BY '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)%>' AND '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)%>' <%}%>
              <%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
                   <%=odiRef.getColList("", "[CX_COL_NAME]\t"+
                        "<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
                        , ",\n\t\t\t", "","")%>          
    <%}%>     LOCATION (<%=odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")%>)
    <%=odiRef.getUserExit("EXT_PARALLEL")%>
    REJECT LIMIT <%=odiRef.getUserExit("EXT_REJECT_LIMIT")%>
    # Create the statement
    myStmt = myCon.createStatement()
    # Execute the trigger creation
    myStmt.execute(createTblCmd)
    myStmt.close()
    myStmt = None
    # Commit, just in case
    myCon.commit().
         at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:738)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)

    The issue is encountered because the text delimiter defined for the source file does not consist of a pair of delimiters: the LKM calls substring(0,1) and substring(1,2) on the [FILE_ENC_FIELD] value, so a single-character delimiter raises the java.lang.StringIndexOutOfBoundsException seen above.
    Please see support Note [ID 1469977.1] for details.
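    For illustration only (plain Java, not the actual LKM code; the class name is hypothetical), a minimal sketch of why a single-character enclosure delimiter breaks the two substring calls visible in the generated script above:

    public class DelimiterPairCheck {
        public static void main(String[] args) {
            // What the LKM expects in [FILE_ENC_FIELD]: a PAIR of delimiters,
            // e.g. an opening and a closing double quote.
            String paired = "\"\"";
            System.out.println(paired.substring(0, 1)); // opening delimiter: OK
            System.out.println(paired.substring(1, 2)); // closing delimiter: OK

            // A single-character delimiter, as in the failing file definition.
            String single = "\"";
            System.out.println(single.substring(0, 1)); // still OK
            // This mirrors the second call and throws, on the Java 6 runtime
            // shown in the log: java.lang.StringIndexOutOfBoundsException:
            // String index out of range: 2
            System.out.println(single.substring(1, 2));
        }
    }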

  • How to create a .mdf SQL Server database from a Data-Tier Application file that has data?

    This is a noob question, though I use SQL Server databases all the time with Entity Framework when I code in C# using Visual Studio 2013. The development environment is listed below at [A]. I am trying to make a clone of a SQL Server 2008 R2 database (.mdf) that exists online. I can read, connect, and work with this database in Visual Studio 2013, but I wish to make a local copy of the database as an .MDF file. Somewhere in my notes I have a way of creating a local copy from an online database when using Visual Studio, but I forgot how (reviewing my notes, it seems to deal with ADO.NET, which appears to be deprecated in Visual Studio 2013 these days). So I'm looking for another way. What I did was create (or export) a "Data-Tier Application File" from the online SQL Server database, with data, and it seems to have worked, in that this Data-Tier Application file exists on my hard drive and seems to have data in it ("SQL Server Replication Snapshot" is the format, it seems). It contains skeleton code to create a database, but when I tried to execute it with SQL Server 2014 Management Studio, I got a bunch of errors.
    So my questions are:
    1) Can I somehow create an .MDF SQL Server database from a Data-Tier Application file that has data? What tool do I use? I saw this link, http://social.technet.microsoft.com/wiki/contents/articles/2639.how-to-use-data-tier-application-import-and-export-with-a-windows-azure-sql-database.aspx
    and it relates to Azure, but is there a standalone tool for C#/Visual Studio 2013?
    2) Is there an easy way to create an .mdf SQL Server database file from an online file within SQL Server Management Studio? I don't think so, since it would require administrator permissions on the online server, which I don't have. I have permission to read, update, and delete the online database file, but strangely not to download it (the service I use has a tool for backup, but not for download).
    3) Same question as 2), but for Visual Studio 2013? I don't think so, since I notice none of the templates even mentions ADO.NET anymore; instead they go with Entity Framework. Using EF I can of course do anything I want with the online database (CRUD), but it remains online. Maybe there's a switch to make a local copy? I guess I could write a short program to pull all the data out of the online database and put it into a new, duplicate database with the same tables, created on my localhost, but my question is whether there's an easier way, maybe a tool or command I can run from inside Visual Studio?
    Any advice on any of the above questions is appreciated.
    Thank you,
    Paul
    [A] Microsoft Visual Studio Professional 2013
    Version 12.0.21005.1 REL
    Microsoft .NET Framework
    Version 4.5.51641
    Microsoft Web Developer Tools 2013   2.0.40926.0
    SQL Server Data Tools   12.0.30919.1
    Microsoft SQL Server Data Tools
    Windows Azure Mobile Services Tools   1.0
    Windows Azure Mobile Services Tools

    Thanks, but these links are too general to help.
    "2. what do you mean by online file?" - I mean the SQL Server database file is on a remote web server that I rent from, but I am not the administrator of it. I can access my database using SQL Server Authentication, but nothing more.
    Paul
    What do you mean by too general? It explains how you can use a data-tier application to create and deploy databases.
    Maybe this will help you understand better:
    http://www.databasejournal.com/features/mssql/article.php/3911041/Creating-Data-Tier-Applications--in-SQL-Server-2008-R2.htm
    Please mark this as answer if it helps to solve the issue. Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • I would like to build a database in TestStand which collects data from LabVIEW; if you have an example (sequence), that would give me some way to build the sequence

    I would like to build a database in TestStand which collects data from a LabVIEW program. If you have an example (sequence), it would give me some way to build a sequence that has an action step for LabVIEW and a database step.

    There is an example in the \TestStand\Examples\Database directory. Basically there are two ways to connect to your database.
    1. You can use the TestStand database step types. There are steps for opening/closing a database connection, opening/closing an SQL statement, and a step for performing a data operation.
    2. The other way is to use the TestStand database logging capability to write your results to the database. This is the way I would recommend. With database logging, you use the step results container to record all your results as the sequence runs (this is done automatically by TestStand). When the sequence is complete, the process model calls a "Log to Database" sequence that writes the results to the database. You must define your database schema using Configure->Database Options. There are some default (or example) schema definitions already defined. Refer to chapter 18 of your TestStand manual.
    Another way to log the data as the sequence is running is shown in \TestStand\Examples\OnTheFlyReports. This has the advantage of recording data as it is obtained, but it is not as efficient in terms of using a database connection, so I don't recommend this method.
    Please post again if you have any more questions. If you are using stored procedures with your database, I can probably give you some tips.
    Mark

  • Can I transport changes through Quality to Production without dropping data?

    Hi,
    I have data on a Quality system which is being used, and it has become critical to the extent that the Quality system is treated as Production.
    Now, there have been some changes in Development: 5 new key figures and 10 new characteristics. As a result, there have been modifications to the InfoSource, update rules, ODS, MultiProviders, etc. on Development.
    The plan is to transport these changes from Dev to Production and load data on Production for comparison against the data in Quality.
    (It may sound weird, but that is the plan, since there are some problems on Development.)
    My question:
    Since the changes have to go through Quality to Production, will there be a problem with the data on Quality?
    The goal is to freeze the data in Quality. Can the changes go through Quality without the need to drop the data in Quality?
    [Note that the objects are going to Production for the first time: we do not overwrite any existing objects on Production]
    Thanks

    Hi,
    You need not do anything special while releasing the request. If anything needs to be done, Basis should do it.
    For that matter, Basis should be able to copy the CO and DAT files of a particular released request to the import queue of any SAP system and import it there. They would do that using operating-system-level commands.
    For us lesser folks, there are above-the-table methods :-)
    If you go to transaction STMS in the development system and click on the 'red truck' (transport overview) you would be able to see the import queues of the D, Q and P systems.
    You can double-click on each system to see the requests.
    If you have released a change request, it would be exported and available in the import queues of the Q and P systems.
    If the request has been imported the status will show with a 'yellow triangle'.
    If the request is yet to be imported it would show with a 'green square'.
    In a particular import queue, you may choose the request that is not imported and import it into the system.
    Having said all this, it is quite possible that Basis has configured the transport domain in a different manner. It is possible that the Quality system is configured for quality assurance. That is why it is important to do this in association with the Basis team.
    Since the possibilities are many, a link is attached for further reading:
    http://help.sap.com/saphelp_nw04/helpdata/en/44/b4a09a7acc11d1899e0000e829fbbd/content.htm
    Mathew.

  • How to find which process chain fills a data target

    Hi all,
    I have a list of cubes, and I want to know which cube is filled by which process chain.
    Can anyone suggest an appropriate solution?
    Simply put, for reconciliation I want to know which process chain affects which BEx report, so that if one process chain fails I will not generate that specific report for that day.
    I have the list of MultiProviders on which we are generating reports, and I also have the technical names of the reports.

    Ganesh ji,
    Choose the InfoPackage which holds the data targets; on the InfoPackage screen, open the Scheduler tab and you will find the where-used list. This will let you know the particular process chain responsible.
    thanks
    Prabhakaran
    09176665954

  • Get content of a PDF document which has data carrier SAP-SYSTEM

    Hi,
    I need to get the content of a document in binary format. The document is not in the Content Server.
    I used the function modules below to read the file name and its content:
    1. CV120_KPRO_MASTER_DATA_GET - for the file name
    2. CV120_KPRO_CHECKOUT_TO_TABLE - to read the content
    For documents which have a file attached in CV01N, I am able to get the data through these FMs. But these documents have no entry for the file name in DRAW; I don't know why.
    And some documents don't have a file attached in CV01N, but they do have an entry in DRAW for the file. I don't know how these files are attached without an entry in CV03N.
    For example: DRAW-MRK_FILEP = ajhsaka.pdf
                 DRAW-DDTRG1    = SAP-SYSTEM
    For these documents, I am not able to read the binary content using the above-mentioned FMs. It gives a runtime error: file cannot be opened, or not found.
    Please let me know how I can solve my problem.
    Thanks,
    Sandip

    Hi Sandip,
    as you have marked this question as solved, I would kindly ask you to post whether you found a suitable solution for your requirement and how it was realised. This will help other forum users get similar issues solved too.
    Best regards,
    Christoph

  • Posting only customer master IDocs which have data changes

    Hello experts,
    I need some help urgently.
    We have a scenario where a third-party system sends customer master data using IDocs. The issue is that the sending system posts the entire customer master through IDocs. Now we want to pick only those IDocs whose data has actually changed.
    Is there any standard way of doing this, or do I need to write code in a user exit to compare the data?
    Awaiting responses
    Thnx
    Harry

    Yes, I agree. I think this comparison approach is the only solution we can think of for now.
    Thanks!
    If anyone has a different idea, please post it.
    I am closing the thread
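    For what it's worth, the compare-the-data idea boils down to generic change detection. A sketch of that idea in plain Java (all names and the record format are hypothetical; the real implementation would be ABAP inside the user exit):

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch: forward a customer master record for posting
    // only when its payload differs from the last version processed.
    public class CustomerChangeFilter {
        private final Map<String, Integer> lastSeenHash = new HashMap<>();

        // Returns true when the record is new or changed and should be posted.
        public boolean shouldPost(String customerId, String payload) {
            int hash = payload.hashCode();
            Integer previous = lastSeenHash.put(customerId, hash);
            return previous == null || previous != hash;
        }

        public static void main(String[] args) {
            CustomerChangeFilter filter = new CustomerChangeFilter();
            System.out.println(filter.shouldPost("C001", "name=ACME;city=Berlin")); // true: first time seen
            System.out.println(filter.shouldPost("C001", "name=ACME;city=Berlin")); // false: unchanged, skip
            System.out.println(filter.shouldPost("C001", "name=ACME;city=Munich")); // true: changed, post it
        }
    }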

  • List of data targets is not visible during data upload

    Hi all,
    I am trying to load user-defined transactional data into an InfoObject. I did all the necessary customization steps, such as creating the application component, assigning data sources, creating InfoPackages, and then creating update rules in the InfoCubes. Moreover, I wrote a routine which calculates sales revenue based on cost and quantity sold.
    My problem is that when I created the InfoPackage, it does not list any data targets. Can anyone please give tips in this regard?
    thanks in advance
    regards,
    a.fahrudeen

    Hi Fahrudeen,
    I am a little confused here... you say you want to load transaction data into an InfoObject? What do you mean by that?
    You can load transaction data only into data targets such as InfoCubes and DataStore objects. If you are loading data into an InfoObject, that means you are loading master data, for which you obviously won't have data targets listed in your InfoPackage. Only when loading transaction data will you have data targets listed in your InfoPackage.
    Regards
    Manick

  • Can the GATP confirmed date come from the production order finish date?

    Hi Expert,
    I have a question to ask.
    In GATP, I want the APO system to consider production capability; that is, when I create a sales order, the system makes a GATP check in APO. If there is not enough stock, the system should create a PP/DS planned order and return the planned order finish date as the schedule line confirmed date.
    I do not know whether this is possible or not.
    I hope some expert can help me to solve this problem
    thank you in advance!

    Hi Liu,
    Why the planned order date? It could be the production order date (planned order release date) + in-house production time + transportation lead time + GR processing time (at the original location where the order is placed) that determines the material availability date. CTP does exactly this. Make sure that the replenishment lead time is switched off in the product-location master (in fact this is location-independent), as it takes precedence over everything else.
    Also ensure that the end of the checking horizon is larger than the interval established above, else the MAD could be much earlier. Also ensure that transfer of requirements is set at both header and schedule line level for CTP to work.
    Regards,
    Loknath

  • Addition of a new field to an existing cube which has data?

    I have a service provider field in an ODS which is populated with data on a daily basis.
    Now I need to populate that field into the cube and then do the reporting...
    For this, do I need to delete the data from the cube, add the new field, and reload the last 6 months of data? Or can I add a field to the cube without deleting the data?
    Regards

    Remodeling is a new concept introduced in BI 2004s to manage changes to the structure of an InfoProvider effectively while its data is loaded and running. As of now you can make changes to InfoCubes only; the same is to be extended to DSOs and InfoObjects in future releases.
    Before you proceed with the remodeling process, let's bear in mind a few of the golden rules given by SAP.
    --> As a precaution, make a backup of your data before you start remodeling.
    Before you start remodeling, make sure:
    --> You have stopped any process chains that run periodically and affect the corresponding InfoProvider. Do not restart these process chains until remodeling is finished.
    --> There is enough available table space on the database.
    --> After remodeling, check which BI objects connected to the InfoProvider (transformation rules, MultiProviders, queries and so on) have been deactivated. You have to reactivate these objects manually.
    You can go for remodeling. If you still have doubts, try the same in your sandbox first and repeat the process in Production, as you say you have one week.

  • Report with all Data Targets in Production and Record Count

    Hi,
    I am planning to create a new report to handle some production maintenance work.
    How do I create a report with a list of each ODS and cube and the count of active records? Can I use a report from BW Statistics and modify it based on my requirements?
    The "Last Activation Date" is also required.
    Any suggestions? Please do let me know.
    Thanks
    Anand.

    Hi,
    The load might be taking too much time as the system is busy.
    Wait for some time (30 mins) and see if the load goes through.
    The load might have gotten stuck.
    Change the request to red and reload the data.
    After doing an init without data transfer, run the delta.
    Cheers,
    Srinath.

  • How to enter data into a data block text item which has an LOV associated

    Hi,
    I have a data block, and one of its text items has an LOV assigned. When I populate this text item using the LOV and do execute_query, it takes the value in the text item and adds it to the search criteria. But when I enter a value manually in that text item and do execute_query, it shows me an alert (which I created): "Please enter a value".
    My question is: why is it not taking the value that I enter manually? It looks like the field is being cleared before the query executes.
    How do I avoid this problem and make sure that the value entered in the text item is added to the WHERE clause of the query?
    Any advice?
    Thanks in advance
    R.G

    Problem solved!
    Before execute_query runs, all the text items are cleared, so I used a global variable to store that value.
    Thanks anyway
    R.G

  • Can I demand a new product which has had repeated failures within 3 months of purchase?

    Hi, less than 2 months ago I bought a MacBook Pro with Retina display at an authorized Apple store in Chile.
    The computer has restarted on several occasions, interrupting my work.
    When starting, the computer displays the following information:
    "The computer has been restarted because of a problem."
    Today I went to the store where I bought it.
    I had to hand it in for examination and diagnosis.
    Can I request a new computer? It is a very expensive computer to fail so quickly, and it is also my working tool.
    Thanks!!

    Ah, OK! Thank you very much for the help, SIG.
    Anyway, I hope they return my computer as soon as possible. =)
    I also hope that someone who works at Apple can answer my questions based on the Mac product warranty laws.
