Data mart status has been reset by deleting an in-between request in the source ODS

Hi SDN,
We have a daily delta load going from an ODS to a cube, and we accidentally reset the data mart status by deleting a request that sat in between a number of other requests in the source ODS. When we deleted the request, a pop-up asked 'Do you want to delete the data in the cube?' and we confirmed it. After that, in the manage view of the source ODS, the data mart status is no longer shown for any of the requests that have been loaded to the target cube. We have since reconstructed the data for the deleted request from the PSA.
When the next day's load comes into the ODS, will the ODS send the correct delta update to the target cube?
If the correct delta is not updated to the cube, is there a method we can follow to maintain data consistency without deleting the data in the target cube?
Thank you,
Prasaad

Hi,
You deleted the data in the cube and reloaded, but the data mart status is not appearing in the ODS. If you have all the data in the ODS: delete the data in the cube, then delete the data mart status in the ODS, then right-click the ODS and choose 'Update ODS Data in Data Target'. A fresh request will then update the cube. This is an init-only load; deltas will flow from the next day onwards.
Thanks
Reddy

Similar Messages

  • Get back the Data mart status in ODS and activate the delta update.

    I ran into a problem when deleting requests in an ODS.
    There is a cube (1st level) that gets loaded from an ODS (2nd level), which in turn gets loaded from 3 ODSs (3rd level). We wanted to delete the recent requests from all the data targets and reload from the PSA. While deleting the request in the ODS (2nd level), a window was displayed showing the following:
    - The request 132185 was already retrieved by the data target BP4CLT612.
    - The delta update in BP4CLT612 must be deactivated before deleting the request.
    - Do you want to deactivate the delta update in data target BP4CLT612?
    I clicked on 'Execute changes' in the window, and it removed the data mart status for all the requests that I had not deleted.
    The same thing happened in the 3 ODSs (3rd level).
    It is clear that if we load further data from the source system, it will load all the records from the beginning.
    To avoid this, can anybody help me with how to reset the data mart status and reactivate the delta update?

    Hi Satish,
    You have to make the requests red in the cube and back them out of the cube before you can delete the requests from the base targets (from which the cube gets its data).
    Then you have to reset the data mart status for the requests in your L2 ODS before you can delete requests from the ODS.
    I think you tried to delete without resetting the data mart status, which has upset the delta sequence.
    To correct this:
    For the L2 ODS, do an init without data transfer from the three lower-level ODSs, after removing the init request from the scheduler menu in the init InfoPackage.
    Do the same from the L2 ODS to the cube.
    Then reconstruct the deleted request in the ODS. It will not show the tick mark in the ODS. Do a delta load from the ODS to the cube.
    See the thread below:
    Urgentt !!! Help on reloading the data from the ODS to the Cube.
    cheers,
    Vishvesh

  • Data Mart Status of the request is not ticked

    Hello Everybody,
    This is my first time dealing with BW.
    After I loaded the data into the ODS, the request in the manage data view was not ticked like the others, although the job completed successfully.
    Can anyone help?
    Many Thanks
    F-B-I

    Hi,
    Have you deleted the data mart status in the ODS?
    You need to follow these steps:
    1. If you changed the request status to red, there is no need to delete the data mart status in the ODS; you can load the data from the ODS to the InfoCube.
    2. If the status is green, you need to delete the data mart symbol in the ODS.
    regards
    sivaraju

  • iTunes network connection has been reset: Anyone know the meaning?

    Within the past 2 days I have started receiving a "Network connection has been reset" error message when trying to purchase new apps or update already purchased apps. Does anyone have better insight than I do into its meaning and cause? Thanks!

    I have dial-up, so I don't know if this might be causing the problem. Is there any way to fix this?
    Sounds like the dial-up connection is dropping. My advice is to download iTunes from a friend's house that has high-speed internet: put the iTunes Setup.exe file on a USB memory stick, take it to your PC, and drag it to the desktop. Then run the install.

  • TS3276 I can't send an e-mail. My status shows mailboxes online, but the Mail preferences keep changing outgoing servers to (offline). What do I do?

    I can't send an e-mail. My status shows the mailboxes online, but the Mail preferences keep changing the outgoing servers to (offline). What do I do?

    Please describe in detail what happened when you took all the applicable steps in the support article that led you here. That's the starting point for any further efforts to solve the problem.

  • Can I see who loaded PL00 data once status has changed?

    Hello there,
    The SEM-BCS monitor allows you to see who last changed the status of a task (i.e. loaded, locked etc.).
    Does anybody know how we can show who last loaded data (in particular PL00 data) once the task for this load has been locked (so the status details only show who locked the task)?
    Thanks and regards
    IM

    Ehm, actually I have no clue, but I recommend reading the How-To "Trace changes of BCS Masterdata" and applying its principle to this topic.
    Browse to http://www.service.sap.com/solutions
    -> Solution Details -> Business Solutions and Applications -> SAP Business Suite -> SAP ERP -> SAP ERP Analytics -> SAP Strategic Enterprise Management (SAP SEM) -> Business Consolidation -> Media Library -> How Tos
    and there you will find the How-To for tracing changes on master data.
    Maybe this will help.
    BR

  • SAP GRC AC 5.3 SP09 Data Mart: How to set up this new feature

    Hi there,
    Does anyone already have some experience setting up the new data mart functionality in AC 5.3 SP09?
    I have read through the data mart configuration section of the AC 5.3 config guide (pages 55-56 and 321-322) but don't understand how this process should work, in the simplest way, to get simple flat files out of AC 5.3.
    We do not use Crystal Reports; we just want flat-file data extracts out of AC 5.3 so we can load them into MS Access.
    1. Regarding the creation of a new data source in the Visual Admin: is this required when I just want to get flat files out of AC 5.3 using the data mart jobs? We use an Oracle DB in a dual-stack install for AC 5.3. Do I need to install an Oracle JDBC driver to set up the data source?
    2. Where am I supposed to find the data mart flat files that result from the data mart jobs when I do not have a connection set up to a particular analysis tool's DB? Do I need to define a file location in the data mart jobs in RAR?
    Thanks for any help and regards,
    Markus

    Dear all,
    got a reply from SAP on a message regarding this issue, stating that the connection process outlined in the document
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/10e245d3-09a5-2c10-3296-f3afe44497e2&overridelayout=true
    can be applied to an Oracle DB as well, if that is where your GRC AC 5.3 data is stored. The prerequisite is to add the additional data source in your Visual Admin.
    Via the data mart jobs you then fill the data mart, and via ODBC for Oracle you should be able to access this data using Crystal Reports or MS Access.
    Regards,
    Markus

  • Oracle BPM Process Data mart

    I am required to create audit reports on BPM workflows.
    I am new to this and need some guidance on configuring the BPM Process Data Mart: what are the prerequisites, and what are the steps to configure it?
    I also need some input on the BAM database: what is the frequency of data upload, and is it an update or an insert in BAM?

    Hi,
    You might want to check out the Administration and Configuration Guides on http://download.oracle.com/docs/cd/E13154_01/bpm/docs65/index.html.
    I suspect you might find the BAM and Data Mart portions of this documentation a bit terse, so I've added steps below that provide more detail. I wrote this for ALBPM 6.0, but believe it will still work for Oracle BPM 10g. It was created from an earlier ALBPM 5.7 document Support wrote, called "ALBPM 5_7 Configuring and Troubleshooting the BAM and DataMart Updater.pdf".
    You can define how often you want the contents in both databases updated (actually inserted) and how long you want to persist the contents of the BAM database during the configuration.
    Here's the contents of the document:
    1. Introduction
    The use of BAM (Business Activity Monitoring) and Data Mart (or Warehouse) information is becoming more and more widespread in today’s BPM project implementations for the obvious benefits they bring to the management and tuning of processes.
    BAM is basically composed of a collection of measurements of the current process load and execution times. This gives us an idea of how the business is doing at this moment (in a pseudo-real-time fashion).
    Data Mart, on the other hand, is a historical view of the process load and execution times, which gives us an idea of how the business has developed since the projects were put in place.
    In this document we are not going to describe exhaustively all configuration aspects of the BAM and Data Mart Updater, but rather we will quickly move from one configuration step to another paying more attention to subjects that have presented some difficulties in real-life projects.
    2. Creating the Service Endpoints
    The databases for BAM and for Data Mart first have to be defined in the External Resources section of the BPM Process Administrator.
    In the following example the service endpoint ‘BAMJ2EEWL’ is being defined. This definition is going to be used later as the BAM storage. At this point nothing is created.
    Add an External Resource with the name ‘BAMJ2EEWL’ and, as we use Oracle, select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMBAM’. This user, and its database, will be created later
    ·     the password for the user
    Scroll down to the bottom of the page and click <Save>.
    In addition to the standard JDBC connection that is going to be used by the Updater Service, a remote JDBC configuration needs to be added because the Engine runs in a WebLogic J2EE container. This Data Source is needed to grant the Engine access to the BAM tables through the J2EE connection pool instead of through a dedicated JDBC connection. The following is an example of how to set this up.
    Add an External Resource with the name ‘BAMremote’ and select the Oracle driver, then click <Next>
    On the following screen, specify:
    ·     the Lookup Name that will be used subsequently in WebLogic - here I have given it the name ‘XAbamDS’
    Then click <Save>.
    In the next example the definition ‘DWHJ2EEWL’ is created to be used later as Data Mart storage. If you are not going to use a Data Mart storage you can skip this step.
    Add an External Resource with the name ‘DWHJ2EEWL’ and select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMDWH’. This user, and its database, will be created later
    ·     the password for the user
    3. Configuring BAM Updater Service
    Once the service endpoint has been created the next step is to enable the BAM update, select the service endpoint to be used as BAM storage and configure update frequency and others. Here the “Updater Database Configuration” is the standard JDBC we configured earlier and the “Runtime Database Configuration” is the Remote JDBC as we are using the J2EE Engine.
    So, here’s the example of how to set up the BAM Updater service….
    Go into ‘Process Monitoring’ and select the ‘BAM’ tab and enter the relevant information (using the names created earlier – use the drop down list to select):
    Note that here, to allow me to quickly test BAM reporting, I have set the update frequency to 1 minute. This would not be the production setting.
    Once the data is input, click <Save>.
    We now have to create the schema and related tables. For this we will open the “Manage Database” page that has appeared at the bottom of the BAM screen (you may have to re-select that Tab) and select to create the database and the data structure. The user required to perform this operation is the DB system administrator:
    Text showing the successful creation of the database and data structures should appear.
    Once we are done with the schema creation, we can move to the Process Data Mart configuration screen to set up the Common Updater Service parameters. Notice that the service has not been started yet… We will get to that point later.
    4. Configuring Process Data Mart Updater Service
    In the case that Data Mart information is not going to be used, the “Enable Automatic Update” checkbox must be left off and the “Runtime Database Configuration” empty for this service. Additionally, the rest of this section can be skipped.
    In the case it is going to be used, the detail level, snapshot time and the time of update should be configured; in addition to enabling the updater and choosing the storage configuration. An example is shown below:
    Still in ‘Process Monitoring’, select the ‘Process Data Mart’ tab and enter the name created earlier (use the drop down list to select).
    Also, un-tick the Generate O3 Cubes (see later notes):
    Then click <Save>.
    Once those properties have been configured, the database and the data structure have to be created. This is performed on the “Manage Database” page, for which the link has appeared at the bottom of the page (as with BAM). Even though this page is identical to the one shown above (for the BAM configuration), it has been opened from the link on the “Process Data Mart” page, and this makes it different.
    Text showing the successful creation of the database and data structures should appear.
    5. Configuring Common Updater Service Parameters
    In the “Process Data Mart” tab of the Process Monitoring section, along with the parameters that are specific to the Data Mart, we will find some parameters that are common to all services. These parameters are:
    • Log directory: location of the log file
    • Messages logged from Data Store Updater: severity level of the Updater logs
    • Language
    • Generate Performance Metrics: enables performance metrics generation
    • Generate Workload Metrics: enables workload metrics generation
    • Generate O3 Cubes: enables O3 Cubes generation
    In this document we are not going to describe in detail each parameter. But we will mention a few caveats:
    a. the Log directory must be specified in order for the logs to be generated
    b. the Messages logged from Data Store Updater setting, which indicates the level of the logs, should be DEBUG for troubleshooting and WARNING otherwise
    c. Performance and Workload Metrics need to be on for typical BAM usage, and even when either metric might not be used in the initial project releases, it is recommended to leave them on in case they turn out to be useful in the future
    d. the generation of O3 Cubes must be off if this service is not used; otherwise the Data Mart Updater service might not work properly.
    The only change required on this screen was to de-select ‘Generate O3 Cubes’, as shown in the last section.
    6. Set up the WebLogic configuration
    We need to set up the JDBC data source specified above, so go to Services / JDBC / Data Sources.
    Click on <Lock and Edit> and then <New> to add a New data source.
    Specify:
    ·     the Name – use the name you set up in the Process Administrator
    ·     the JNDI Name – again use the name you set up in the Process Administrator
    ·     the Database Type – Oracle
    ·     use the default Oracle Database Driver
    Then click <Next>
    On the next screen, click <Next>
    On the next screen specify:
    ·     the Database Name – this is the SID – for me that is XE
    ·     the Host Name – as I am running on my laptop, I’ve just specified ‘localhost’
    ·     the Database User Name and Password – this is the BAM database user specified in the Process Administrator
    Then click <Next>
    On the next screen, you can test the configuration to make sure you have got it right, then click <Next>
    On the next screen, select your server as the target server and click <Finish>:
    Finally, click <Activate Changes>.
    7. The Last Step: Starting Up and Shutting Down the Updater Service
    The ALBPM distribution differs depending on the operating system. In the case of the Updater Service:
    -     On Unix-like operating systems the service is started or stopped with the albpmwarehouse.sh shell script. The command looks like this:
    $ALBPM_HOME/bin$ ./albpmwarehouse.sh start
    -     On Windows the service is installed or uninstalled as a Windows service with the albpmwarehouse.bat batch file. The command looks like:
    %ALBPM_HOME%\bin> albpmwarehouse.bat install
    After installing the service, it has to be started and stopped from the Microsoft Management Console. Note also that Windows will automatically start the installed service when the computer starts. In either case the location of the script is ALBPM_HOME/bin, where ALBPM_HOME is the ALBPM installation directory. An example would be:
    C:\bea\albpm6.0\j2eewl\bin\albpmwarehouse.bat
    8. Finally: Running BAM dashboards to show it is Working
    Now we have finally got the BAM service running, we can run dashboards from within Workspace and see the results:
    9. General BAM and Data Mart Caveats
    a. The basic difference between these two collections of measurements is that BAM keeps track of the current process load and execution times while Data Mart contains a historical view of those same measurements. This is why BAM information is collected frequently (every minute) and cleared out every several hours (or every day), and why Data Mart is updated infrequently (once a day) and grows indefinitely. Moreover, BAM measurements can be thought of as a minute-by-minute sequence of Engine Events snapshots, while Data Mart measurements are a daily sequence of Engine Events snapshots.
    b. BAM and Data Mart table schemas are very similar but they are not the same. Thus, it is important not to use a schema created with the Manage Database for BAM as Data Mart storage or vice-versa. If these schemas are exchanged by mistake, the updater service will run anyway but no data will be added to the tables and there will be errors in the log indicating that the schema is incorrect or that some tables could not be found.
    c. BAM and Data Mart Information and Services are independent from one another. Any of them can be configured and running without the other one. The information is extracted directly from the Engine Database (PPROCINSTEVENT table is the main source of info) for both of them.
    d. So far there has not been a mention of engines, projects or processes in any of the BAM or Data Mart configurations. This is because the metrics of all projects published under the current Process Administrator (or, more precisely, FDI Directory) are going to be collected.
    e. It is also important to note that only activities for which events are generated are going to be measured (and therefore, shown in the metrics). The project default is to generate events only for Interactive activities. This can be changed for any particular activity and for the whole process (where the activity setting, when specified, overrides the process setting). Unfortunately, there is no project setting for events generation so far; thus, remember to edit the level of event generation for every new process that is added to the project.
    f. BAM and Data Mart metrics are usually enriched with Business Variables. These variables are a special type of External Variables. An External Variable is a process variable with the scope of an Instance and whose value is stored on a separate column in the Engine Instances table. This allows the creation of views and filters based on this variable. A Business Variable, then, shares all the properties of an External Variable plus the fact that its value is collected in all BAM and Data Mart measurements (in some cases the value is shown as it is for a particular instance and in others the value is aggregated).
    The caveat here is that there is a maximum number of 256 Business Variables per FDI. Therefore, when publishing several projects into a single FDI directory it is recommendable to reuse business variables. This is achieved by mapping similar Business Variables of different projects with a unique real Variable (on the variable mapping performed at publish time).
    g. Configuring the Updater Service Log
    In section 5. Configuring Common Updater Service Parameters we have seen that there are two common Updater properties related to logging. These properties are “Log directory” and “Messages logged from Data Store Updater”, and they specify the location and level of these two files:
    - dwupdater.log: which is the log for the Data Mart updater service
    - bam-dwupdater.log: which is the log for the BAM updater service
    In addition to these two properties, there is a configuration file called ‘WarehouseService.conf’ that allows us to modify these other properties:
    - wrapper.console.loglevel: level for the updater service log
    - wrapper.logfile.loglevel: level for the updater service log
    - wrapper.java.additional.n: additional argument to the service JVM
    - wrapper.logfile.maxsize: maximum size of the updater service log files
    - wrapper.logfile.maxfiles: maximum number of updater service log files
    - wrapper.logfile: updater service log file name (the default value is dwupdater-service.log)
    9.1. Updater Service Log Configuration Caveats
    a. The first three parameters listed above have to be modified when increasing the log level to DEBUG (the default is WARNING). The loglevel parameters have to be set to DEBUG, and a wrapper.java.additional.n (where n is the next integer after the indexes already in use) has to be set to -ea to enable asserts, since without this option no DEBUG messages are generated.
    b. Of the other arguments, maxfiles might need to be increased to hold a few more days of data when the log level is set to DEBUG (with the default value, up to two days are stored).
    c. The updater service has to be stopped, uninstalled, installed and then started for any of these changes to take effect.
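    As a minimal sketch of such a change, a WarehouseService.conf tuned for DEBUG troubleshooting might contain the lines below (the values and the index 3 are illustrative; use the next unused wrapper.java.additional.n index in your own file):
    # raise both updater service logs to DEBUG
    wrapper.console.loglevel=DEBUG
    wrapper.logfile.loglevel=DEBUG
    # enable Java asserts; without -ea no DEBUG messages are generated
    wrapper.java.additional.3=-ea
    # keep more rotated log files while debugging
    wrapper.logfile.maxfiles=10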
    Hope this helps,
    Dan

  • How to load the existing data mart request into the cube

    Hi Friends,
    I have a scenario: I have a DSO and one cube, and I am loading data from the DSO to the cube.
    The number of records in the DSO is large, and due to some calculation in the routine I need to load the data year by year (I have data for 2007 and 2008). I have loaded the 2007 data into the InfoCube, and in the DSO I can see the data mart symbol against the request ID now that the load has finished successfully.
    When I try to load the 2008 data from the DSO to the cube, I get 0 records. I realised that the data mart symbol already exists in the DSO.
    How do I load the 2008 data in this scenario? If I delete the data mart symbol, I delete the cube request as well.
    Does anyone have an idea on this?
    Thanks in advance.

    Hi,
    Things are not clear.
    Is the load a delta or a full load? Is it through a DTP, or are you using the 3.5 flow?
    In any case, if you do a full load or full repair based on the year selection, it should pick the records from the source; the data mart status has nothing to do with full loads.
    The data mart status comes into the picture only when you schedule delta loads.
    Do a full load with a selection on the year from the DSO to the cube; there is no need to delete the data mart status. If you delete it, the next delta will bring that request again. Does that request in the DSO contain data for 2008 only? If yes, you can just delete the data mart status for it and run a delta; if not, do full loads as described.
    Thanks
    Ajeet

  • OIM11gR1 - Trusted recon event going to Data Received status

    Hi All,
    I am trying to do a trusted recon for an existing user in OIM, and the recon event always goes to Data Received status.
    Everything looks good on the data side and there are no exceptions in the logs. It has happened almost 10 times: every recon event is stuck in Data Received status.
    Has anybody had similar issues? Please suggest a solution.
    Thanks..

    If you have a lot of events, they will be processed when they reach the batch amount specified in your system configuration. If you are creating individual recon events with code, make sure to call callingEndOfJobAPI() to complete the event. If your scheduled task is not completing successfully, it might not be triggering the processing at the end. And if the data is extremely large, you might have a stuck thread in the batch processing, which has caused it to fail to process.
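    To illustrate that last point about completing events from code, here is a minimal sketch, assuming an OIM 11g client setup (host, port, credentials, and the login config path are placeholders; callingEndOfJobAPI() is the call named above):
    import java.util.Hashtable;
    import javax.security.auth.login.LoginException;
    import oracle.iam.platform.OIMClient;
    import oracle.iam.reconciliation.api.ReconOperationsService;

    public class FinishRecon {
        public static void main(String[] args) throws LoginException {
            // run with -Djava.security.auth.login.config=<path to authwl.conf>
            Hashtable<String, String> env = new Hashtable<String, String>();
            env.put(OIMClient.JAVA_NAMING_FACTORY_INITIAL, "weblogic.jndi.WLInitialContextFactory");
            env.put(OIMClient.JAVA_NAMING_PROVIDER_URL, "t3://oimhost:14000"); // placeholder URL
            OIMClient client = new OIMClient(env);
            client.login("xelsysadm", "password".toCharArray()); // placeholder credentials
            ReconOperationsService recon = client.getService(ReconOperationsService.class);
            // processes pending events so they do not stay in Data Received
            recon.callingEndOfJobAPI();
            client.logout();
        }
    }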
    -Kevin

  • The Date Accessed attribute is being reset on all the files in a folder

    Hi all,
    I have an issue where the "Date Accessed" attribute is reset on all the files in a folder whenever I open a single file. The folder is on a file server mapped as a drive for me. This happens to every user who connects to that file server, and I see the same issue over RDP as well. Is there any option to check what is resetting the "Date Accessed" attribute on all the files? The Date Accessed attribute sits alongside Date Created and Date Modified. We need to fix this issue, as it relates to a security concern for us. The file server is Windows 2008 R2 and the client machines are Windows 7.
    Side note: I can confirm that no software, antivirus, or offline sync is causing this issue.
    Thank you,
    Sampath

    Hi,
    It seems that some specific process is accessing the files and updating the attribute. You could configure auditing on the shared folder, then go through the audit log to check which process is doing the read operation on all these files.
    Configuring Audit Policies
    http://technet.microsoft.com/en-us/library/dd277403.aspx
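    As a sketch, the command below (run from an elevated prompt on the file server) turns on the File System audit subcategory; the folder's own auditing entries still have to be added under its Security properties, as the article above describes:
    auditpol /set /subcategory:"File System" /success:enable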
    Best Regards,
    Mandy

  • Table Owners and Users: Best Practice for Data Marts

    2 Questions:
    (1) We are developing multiple data marts that share the same instance. We want to deny access to the users while tables are being updated. We have one generic user (BI_USER) with read access through one of the popular BI tools. For the current (first) data mart we denied access by revoking the privilege from BI_USER; however, going forward the other data marts' tables will be updated on different schedules, and we do not want to deny access to all the data marts at once. What is the best approach?
    (2) What is the best methodology for ownership of tables in different data marts that share tables across marts? Can we create one generic ETL_USER to update tables with different owners?
    Thanx,
    Jim Masterson

    If you have to go with generic logins, I would at least have separate generic logins for each data mart.
    Ideally, data loads should be transactional (or nearly transactional), so you don't have to revoke access ever. One of the easier tricks to accomplish this is to load data into a shadow table and then rename the existing table and the shadow table. If you can move the data from the shadow table to the real table in a single transaction, though, that's even better from an availability standpoint.
    If you do have to revoke table access, you would generally want to revoke SELECT access to the particular object from a role while the object is being modified. If this role is then assigned to all the Oracle user accounts, everyone will be prevented from viewing the table. Of course, in this scenario, you would have to teach your users that "table not found" means that the table is being refreshed, which is why the zero downtime approach makes sense.
    You can have generic users that have UPDATE access on a large variety of tables. I would suggest, though, that you have individual user logins to the database and use roles to grant whatever ad-hoc privileges users need. I would then create one account per data mart, with perhaps one additional account for the truly generic tables, that owns each data mart's objects. Those accounts would then grant different database privileges to different roles, and you would grant those roles to different users. That way, Sue in accounting can have SELECT access to portions of one data mart and UPDATE access to another data mart without being granted every privilege under the sun. My hunch is that most users should not be logging in to, let alone modifying, all the data marts, so their privileges should reflect that.
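    As a hedged illustration of both suggestions (every table, role, login, and connection detail below is invented for the example), a load job might drive this from JDBC roughly like so:
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class MartRefresh {
        public static void main(String[] args) throws Exception {
            // placeholder connection details for the mart owner account
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "SALES_MART_OWNER", "secret");
                 Statement st = con.createStatement()) {
                // variant 1: block readers through a role while the table is refreshed
                st.execute("REVOKE SELECT ON fact_orders FROM sales_mart_read");
                // ... reload fact_orders here ...
                st.execute("GRANT SELECT ON fact_orders TO sales_mart_read");

                // variant 2 (near zero downtime): load a shadow table, then swap names.
                // Each DDL statement auto-commits, so the swap is fast but not atomic.
                st.execute("ALTER TABLE fact_orders RENAME TO fact_orders_old");
                st.execute("ALTER TABLE fact_orders_shadow RENAME TO fact_orders");
            }
        }
    }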
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Travel Expense Manager (PR05) "Add Data Maintain:Status" upon SAVE

    Hi TM Gurus,
    Good day.
    In PR05, the "Add Data Maintain: Status" screen appears upon saving, with the default Settlement Status set to "Open". Do you happen to know if this default value can be changed via configuration to "To be settled"? If yes, can you provide the steps for how to do this?
    Thank you very much.
    Best regards.

    Customising table TRVPA via PE03 should also help you achieve this, as you have the option of "Open" or "To be settled".
    Entry "WRP" for the SAP EP interface:
    The entry "WRP" allows you to control the dialog for saving a trip. The
    settings apply to the travel services in the SAP EP interface. With some
    statuses, a distinction must be made between the process steps Temporary
    Save and Save and Send.
    Possible values:
    o   0 =  Trip saved temporarily -> "Trip completed" + "Open"
                 Trip saved with "Save and Send" ->  "Trip completed" + "To
         be settled"
    o   1 = Trip saved temporarily -> "Request Entered" + "Open"
                Trip saved with "Save and Send" -> "Request entered" + "To be
         settled"
    o   2 = Trip saved temporarily -> "Request entered" + "Open"
                Trip saved with "Save and Send" -> "Request approved" + "To
         be settled"
    o   3 = Trip saved temporarily -> "Trip completed" + "Open"
               Trip saved with "Save and Send" -> "Trip completed" + "To be
         settled"
    o   4 = Trip saved temporarily -> "Trip completed" + "Open"
                       Trip saved with "Save and Send" -> "Trip approved" +
         "To be settled"
    o   5 =  Trip in the future (end date later than creation date)
                  -> Trip saved temporarily -> "Request entered" + "Open"
                 Trip saved with "Save and Send" -> "Request entered" + "To
         be settled"
                 Trip in the past (end date earlier or same as creation date)
                 -> Trip saved temporarily -> "Trip completed" + "Open"
                 Trip saved with "Save and Send" -> "Trip completed" + "To be
         settled"

  • Travel Expense Manager (PR05): Trigger "Add Data Maintain:Status" upon SAVE

    Hi TM Gurus,
    Good day.
    The "Add Data Maintain: Status" Screen can be manually accessed by clicking the "Trip Status" Button in PR05.
    Can you kindly confirm if it's possible to automatically trigger the "Add Data Maintain:Status"  Screen via configuration when clicking the save button in Transaction PR05 - Travel Expense Manager?
    Thank you very much.
    Best regards.
    Rainnier

    Hi Sally,
    Good day.
    I will now tag this as answered. Thanks again for your help on this.
    On the other hand, I have another question: the "Add Data Maintain: Status" screen now appears, but the default Settlement Status is "Open". Do you happen to know if this can be changed via configuration to "To be settled"?
    Can you please answer it at this link: Travel Expense Manager (PR05) "Add Data Maintain:Status" upon SAVE
    Thank you very much.

  • Data Server Status Pending

    Hi all,
    I am using MII 12.0.2. I tried to connect SQL Server 2000 with MII, and it shows the data server status as Pending.
    I deployed oracle-jdbc14.jar into the JDBC drivers.
    Server URL: jdbc:inetdae:<ServerName>:1433?database=Northwind&sql7=true
    JDBC Driver: com.inet.tds.TdsDriver
    Please help me with how to resolve this problem.
    Thanks.
    John R

    John
    Point 1: You need to deploy the JDBC driver for SQL Server 2000, not oracle-jdbc14.jar, which is for the Oracle database. You can get the driver [here|http://www.microsoft.com/downloads/details.aspx?familyid=9f1874b6-f8e1-4bd6-947c-0fc5bf05bf71&displaylang=en]
    Refer to the documentation available there.
    Point 2: Change the URL for SQL Server 2000 to jdbc:microsoft:sqlserver://<ServerName>:<PortNumber>;
    Point 3: Change the JDBC driver to "com.microsoft.jdbc.sqlserver.SQLServerDriver"
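    For a quick standalone connectivity test outside MII, here is a minimal sketch (server name, port, and credentials are placeholders; the driver jars msbase.jar, mssqlserver.jar and msutil.jar must be on the classpath):
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class SqlServerPing {
        public static void main(String[] args) throws Exception {
            // register the Microsoft SQL Server 2000 JDBC driver
            Class.forName("com.microsoft.jdbc.sqlserver.SQLServerDriver");
            // DatabaseName selects the database, matching the Northwind example above
            try (Connection con = DriverManager.getConnection(
                    "jdbc:microsoft:sqlserver://myserver:1433;DatabaseName=Northwind",
                    "user", "password")) {
                System.out.println("Connected: " + !con.isClosed());
            }
        }
    }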
    You can also follow the thread 'SQL Server 2005 Connection Status Pending'.
    Hope this helps
    Regards
