Transformation between DS and DSO missing Non-cumulative KYF

Hi SDN,
I installed the business content for CML and am working with the 0CML_DELTA_CAP DataSource. I would like to create a transformation between this DataSource and the DSO 0CML_O03.
By default, when you install the CML business content, the datasources use the 3.x dataflow concept (transfer rules, infosources, and update rules).
I would like to use the 7.x dataflow concept, so I created a transformation between the 0CML_DELTA_CAP DataSource and the 0CML_O03 DSO. In this transformation, the fields 0CML_B330, 0CML_B340, 0CML_B360, and 0CML_B380 are missing from the data target. These key figures are non-cumulative with a non-cumulative value change (0CML_B330 uses 0CML_B331 as its value change). The value-change key figures show up in the transformation, but the non-cumulative key figures do not.
Does anyone have any idea why the non-cumulative key figures are not showing up in the transformation?
Thanks,

Hi Brendon,
The non-cumulative key figures themselves are not mapped in the transformation. Only the inflow and outflow key figures belonging to each non-cumulative key figure are mapped.
You can check the properties of the non-cumulative key figure in RSD1, where you will find the corresponding inflow and outflow key figures. If both of these are mapped in the transformation, the non-cumulative key figure will calculate its value in any report that includes it as:
Non-cumulative KF value = Inflow value - Outflow value.
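To make the arithmetic concrete, here is a small illustration in plain Python (not BW code; the data and function name are invented for the sketch) of how the reported non-cumulative value follows from the mapped inflow and outflow key figures alone:

```python
# Sketch: how a non-cumulative key figure is derived at query time
# from its inflow and outflow key figures. Illustrative only -- BW
# does this internally; names and data here are invented.

def non_cumulative_balance(records):
    """Running balance per period from (period, inflow, outflow) postings."""
    balance = 0.0
    result = []
    for period, inflow, outflow in records:
        balance += inflow - outflow   # Non-cumulative KF = inflow - outflow
        result.append((period, balance))
    return result

postings = [
    ("2024-01", 100.0, 20.0),  # +100 in, -20 out -> balance 80
    ("2024-02", 50.0, 10.0),   # -> balance 120
    ("2024-03", 0.0, 30.0),    # -> balance 90
]
print(non_cumulative_balance(postings))
```

This is why only the inflow and outflow key figures need to arrive through the transformation: the non-cumulative key figure itself is derived, not loaded.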
Hope this helps.
Regards,
Abhishek.

Similar Messages

  • I cannot transform between Arabic and English language in writing

    I cannot switch between Arabic and English when typing; I have to restart my browser every time. I need a solution.

    If you are talking about typing text then make sure that the language toolbar is visible.
    * http://windows.microsoft.com/en-US/windows7/The-Language-bar-overview
    In Firefox you can switch the bidi text direction via Ctrl+Shift+X.

  • Please write the difference between ODS and DSO...

    Hi all,
    Please tell me the differences between ODS and DSO. I think both are the same in structure, have the same update types (overwrite, addition), and the same tables (change log, activation queue, active table). Is there still a difference between ODS and DSO? I can assign points.
    thanks
    arya

    Hi,
    You can check the below document for new Features in BI:
    http://help.sap.com/saphelp_nw04s/helpdata/en/a4/1be541f321c717e10000000a155106/frameset.htm
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    Reg
    Pra

  • Concept of Non-cumulative KYF

    Hi experts,
    We are using some non-cumulative KYFs in our FI customizing cubes. This has really brought trouble to the project, for example:
    - Poor query performance
    - No meaningful 0FISCPER values, because 0CALDAY has to be set as the base time reference characteristic
    - No useful aggregates can be created, as 0CALDAY cannot be removed from any aggregate
    With so many troubles showing up, I can't help wondering: where is the advantage?
    In our cube, Z1BALANCE is a non-cumulative KYF with inflow (0DEBIT) and outflow (0CREDIT). What is the difference if I replace it with a general KYF Z2BALANCE and write code in the update rules: result = 0DEBIT - 0CREDIT ?
    Would you please explain this in general words and examples instead of a bunch of links?
    Regards,
    Aaron Wang

    Hi,
    First of all, this link will help you understand non-cumulatives better:
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/41/b987eb1443534ba78a793f4beed9d5/frameset.htm
    Non-cumulatives are calculated at query runtime; that is why query performance is slow.
    One option is to compress the cube at regular intervals in the case of non-cumulatives. This helps improve query performance.
    I think you can give any time reference in the cube if you want something other than 0CALDAY; that you can do at the cube level.
    Yes, aggregates cannot be created.
    You can use a non-cumulative as well as a cumulative key figure for the same purpose; it depends on the scenario you are working on.
    If your data changes frequently, you should use cumulative key figures.
    But if it changes infrequently, you should use non-cumulative ones.
    The advantage is that it gives you the values with reference to time, and there are certain scenarios which can only be modeled through non-cumulatives.
    Inventory is such a case: at any point in time you only have changes in the stock values, i.e. a delta always brings in a change to the inventory values.
    Inventory values can be maintained in the form of inflows and outflows, that is, what increases the stock and what decreases it.
    Now if you just subtract (inflow - outflow), it gives you the value at that particular time.
    This one-page document explains everything.
    In our cube, Z1BALANCE is a non-cumulative KYF with inflow (0DEBIT) and outflow (0CREDIT). What is the difference if I set a general KYF Z2BALANCE to replace it and write code in update rules: result = 0DEBIT - 0CREDIT ?
    You can do it through cumulatives also, but then you have to maintain it at the ODS level, subtract the key figures in the update rules, and maintain the result in the cube.
    That will be stored with reference to time and will work well.
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/8f/da1640dc88e769e10000000a155106/content.htm
    Hope it helps

  • "Require authenticated binding between directory and clients" missing

    A new install of 10.6, upgraded to 10.6.4 Server does not show the option "Require authenticated binding between directory and clients" under the Open Directory - Settings - Policies - Binding tab. I do see the "Enable authenticated directory binding" option. Also, the terminal command sudo slapconfig -setmacosxodpolicy -binding required does work as a workaround.
    This option is not present in Server Admin either on the server itself or on the client machines I'm using to manage the server.
    Is there a way to get the graphical button to become visible?
    Thank you in advance.

    This Option is no longer available in the GUI.
    See Apple Support Article HT4068.

  • Will iBooks sync between iPhone and iPad with non-iCloud Apple ID?

    I have two Apple IDs. One is an email address that I've used for all my iTunes purchases. The other is a .Mac account that I've had for I think ten years. The .Mac account has, of course, been converted to an iCloud account. I got an iPad recently and have bought some books from the iBooks store using my non-iCloud Apple ID. I have books on both my iPad and iPhone and have Sync Bookmarks and Sync Collections enabled on both devices. Both devices are connected to my iCloud account. But the two devices don't sync. Is this because my purchases were made with my non-iCloud account? Is there any way to get them to sync?

    I don't think you are missing anything...
    On the iOS devices tap Settings > iCloud. Switch Photo Stream off then back on then reset the devices.
    Hold the On/Off Sleep/Wake button and the Home button down at the same time for at least ten seconds, until the Apple logo appears.
    See if your photos sync now between the PC, iPad, and iPhone.

  • Data Reconciliation between PSA and DSO.

    Hi Experts,
    The PSA shows 1,000 records, but when we ran the load into the DSO only 600 records arrived, and we don't understand why the remaining records are not being fetched.
    DTP settings:
    Extraction mode: Full
    If anyone has good documentation on how to design a DSO, please forward some links.
    Regards.

    Hi,
    This is not an issue with the DSO.
    There might be filters at various points due to which some of the records are getting filtered out.
    Check the following.
    Check the number of records in the PSA. If it is not equal to 1000, then records are being filtered while fetching from the source:
    1) Check whether there are any filters on the PSA load.
    If the number of records in the PSA is equal to 1000, then:
    1) Check the DTP for filters as well.
    2) Check the routines in the transformation to see whether any records are being filtered out there.
    Hope this helps,
    Sri...

  • Panel between Filmstrip and Viewer missing

    Hey all!
    I have Lightroom 4.0, and I'm editing pictures, but I have come to notice that the panel between the Filmstrip and the Viewer is missing. This was the panel that gave me the ability to check the "Show Mask Overlay" and set the Mask Pins to different display options (Auto, Hide, Show, etc.). I can't figure out how to get it back. Toggling all of the panels does not work.
    Thanks!

    Yes, it doesn't show up when I do click that. It doesn't show up under Crop (would have the 'Done" button) or any of the other adjustments either.

  • Transformation between  xfaform and document types

    Hi again,
    As I understand that I cannot send an xfaform as an attachment to an email (but a document is OK), I would like to know how I can transform an xfaform (with an XML schema) into a document. Do I need to use exportData and importData to do this?
    It is important to me because I need to retrieve a form by email (in either XML/XDP or PDF), do some manual work in my process, and return an email to the user with the answer in the attachment.
    Do you have any best practices for such a scenario?
    Sincerely
    Kim Christensen
    Dafolo A/S
    Denmark

    I tried using the exportData operation; I defined its input variable as document and output as xfaform, and I got the InvalidCoercionException below. Should I use the setValue operation instead?

    2007-12-05 19:56:13,795 INFO  [com.adobe.idp.scheduler.SchedulerServiceImpl] OneShot Trigger created ----------------------------------------
    2007-12-05 19:56:13,811 ERROR [com.adobe.workflow.AWS] stalling action-instance: 508 with message: ALC-DSC-119-000: com.adobe.idp.dsc.util.InvalidCoercionException: Cannot coerce object: <document state="passive" senderVersion="3" persistent="true" senderPersistent="false" passivated="true" senderPassivated="true" deserialized="true" senderHostId="127.0.0.1/192.168.100.234" callbackId="0" senderCallbackId="52" callbackRef="null" isLocalizable="true" isTransactionBound="false" defaultDisposalTimeout="600" disposalTimeout="600" maxInlineSize="65536" defaultMaxInlineSize="65536" inlineSize="5896" contentType="null" length="-1"><cacheId/><localBackendId/><globalBackendId/><senderLocalBackendId/><senderGlobalBackendId/><inline><?xml version="1.0" encoding="UTF-8"?><xfa:datasets xmlns:xfa="http://www.xfa.org/schema/xfa-data/1....</inline><senderPullServantJndiName>adobe/idp/DocumentPullServant/adobejb_server1</senderPullServantJndiName><attributes/></document> of type: com.adobe.idp.Document to type: class com.adobe.idp.taskmanager.form.impl.xfa.XFARepositoryFormInstance
        at com.adobe.workflow.datatype.CoercionUtil.toType(CoercionUtil.java:878)
        at com.adobe.workflow.datatype.runtime.impl.pojo.POJODataTypeRuntimeHandler.coerceFrom(POJODataTypeRuntimeHandler.java:101)
        at com.adobe.workflow.datatype.runtime.impl.pojo.POJODataTypeRuntimeHandler.getNode(POJODataTypeRuntimeHandler.java:127)
        at com.adobe.workflow.dom.VariableElement.setBoundValue(VariableElement.java:93)
        at com.adobe.workflow.pat.service.PATExecutionContextImpl.setProcessDataValue(PATExecutionContextImpl.java:729)
        at com.adobe.workflow.engine.PEUtil.invokeAction(PEUtil.java:583)
        at com.adobe.workflow.engine.ProcessEngineBMTBean.continueBranchAtAction(ProcessEngineBMTBean.java:2863)
        at com.adobe.workflow.engine.ProcessEngineBMTBean.asyncInvokeProcessCommand(ProcessEngineBMTBean.java:646)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.jboss.invocation.Invocation.performCall(Invocation.java:345)

  • Problems moving between CS3 and CS4 - missing plugins

    Hello all. Sorry if this is a really obvious question. I'm still sort of new to InDesign.
    I'm working on a newsletter that I originally created in CS3. When the CS4 trial came out, I downloaded it; the file automatically started opening in that version by default, and I finished it up in CS4. Now my CS4 trial has run out and I don't want to buy it; instead I would like to keep working on the newsletter in CS3. When I try to open it in CS3, however, it says a ton of plug-ins are missing, namely:
    conditional text
    word ready
    appframework
    indexing
    xml
    generic page item
    package and preflight
    image
    master page
    incopy shared
    cjk text attributes
    assignments
    spread
    text
    links
    toc
    text attributes
    scripting
    hyperlinks
    document framework
    cjk grid
    workgroup
    text wrap
    whew! I'm pretty sure I have all the current updates. I tried looking for some of the plugins but I'm not really sure how to go about this or even if it's the right solution.
    Any help would be greatly appreciated! We wanted to get the newsletter to our printer before the holidays but that's not looking so great now.
    Thanks in advance!
    Alyssa

    You cannot open a CS4 file in CS3. If you'd like to email me the file,
    I'll save it back as an INX file for you. Depending upon what features
    you used in CS4 it may or may not be satisfactory: new features will be
    lost or badly mangled, and text is quite likely to reflow.
    Bob

  • Cumulative and Non-cumulative KFs

    Guys,
             so I am confused about cumulative/Non-cumulative keyfigures.  I read what it says about them on help.sap.com
    "Flow/non-cumulative value
    You can select the key figure as a cumulative value. Values for this key figure have to be posted in each time unit, for which values for this key figure are to be reported.
    Non-cumulative with non-cumulative change
    The key figure is a non-cumulative. You have to enter a key figure that represents the non-cumulative change of the non-cumulative value. There do not have to be values for this key figure in every time unit. For the non-cumulative key figure, values are only stored for selected times (markers). The values for the remaining times are calculated from the value in a marker and the intermediary non-cumulative changes.
    Non-Cumulative with inflow and outflow
    The key figure is a non-cumulative. You have to specify two key figures that represent the inflow and outflow of a non-cumulative value.
    For non-cumulatives with non-cumulative change, or inflow and outflow, the two key figures themselves are not allowed to be non-cumulative values, but must represent cumulative values. They must be the same type (for example, amount, and quantity) as the non-cumulative value."
    and also checked a few threads on sdn but still a little confused about their purpose.
    Can someone tell me when do we use which one?
    Thanks,
    RG.

    Hi Ram,
    Cumulative & non-cumulative key figures:
    Basically, key figures are of two types: cumulative and non-cumulative.
    Cumulative key figures are the normal ones you use all the time; their deltas always bring the new value of the key figure. If A has the value 10 and after a change the new value is 20, you would use a cumulative key figure there, and the delta (new value) brings 20.
    But suppose the key figure field carries only the change relative to the prior value, so the delta brings 10 (20 - 10, a change of 10) as the new value of key figure A. Then you have to model it with non-cumulative key figures.
    Now:
    1) Cumulative is for the first case, i.e. the key figure always brings the new value of the key figure, not the change in its value.
    2) NCum. value with NCum. value change:
    In this case there is only one field which brings the changes for a particular key figure, and you have to define that key figure as non-cumulative.
    Example: in the case of stock, a single field brings both ingoing and outgoing values, e.g. 10, -4, 100, -34, ...
    In this case you choose this option and enter the value-change key figure in the space provided.
    3) Non-cumulative with inflow and outflow:
    In this case you have two separate key figures, one for the inflow of stock and one for the outflow.
    You use one key figure for the inflow and one key figure for the outflow.
    The main key figure automatically takes care of the logic and gives the correct output upon summation: net stock value = inflow - outflow.
    Also remember that in this case the inflow and outflow key figures are basic key figures, i.e. cumulative key figures.
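    To illustrate case 2 above with plain Python (not BW code; the helper name is invented for the sketch), the single value-change field yields the stock level as a running total of its deltas:

```python
# Sketch of case 2 (non-cumulative with one value-change field):
# a single delta field carries both ingoing (+) and outgoing (-)
# movements, and the stock level is the running total of the changes.

def stock_from_changes(changes, opening=0):
    """Return the stock level after each posted change."""
    levels = []
    stock = opening
    for delta in changes:
        stock += delta
        levels.append(stock)
    return levels

# The example deltas from the text: 10, -4, 100, -34
print(stock_from_changes([10, -4, 100, -34]))  # [10, 6, 106, 72]
```

Case 3 differs only in that the positive and negative movements arrive in two separate cumulative key figures instead of one signed field.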
    A non-cumulative is a non-aggregating key figure on the level of one or more objects that is always displayed in relation to time. Examples of non-cumulatives include headcount, account balance, and material inventory. Here the aggregation of the key figure is based on another InfoObject.
    There are two different ways to define non-cumulative key figures:
    • Non-cumulative key figure with non-cumulative changes:
    Before you can define the non-cumulative key figure, an additional cumulative key figure containing the non-cumulative change must exist as an InfoObject.
    • Non-cumulative key figure with inflows and outflows:
    There have to be two additional cumulative key figures as InfoObjects for the non-cumulative key figure, one for inflows and one for outflows. The cumulative key figures have to have the same technical properties as the non-cumulative key figure, and the aggregation and exception aggregation have to be SUM.
    Features of non-cumulative key figures:
    A non-aggregating key figure.
    Records are not summarized for reporting.
    Exception aggregation is applied to these key figures on the level of one or more InfoObjects, usually time.
    Examples: headcount, account balance, material stock.
    Consider a simple scenario:
    Date         Net Stock Quantity   Sales Revenue
    01.02.2005   40                   1000
    02.02.2005   50                   2000
    03.02.2005   25                   3000
    Below is the query output if the stock quantity is treated as a cumulative vs. a non-cumulative key figure.
    If stock quantity is taken as a cumulative key figure:
    Date         NET STOCK QUANTITY   SALES REVENUE
    01.02.2005   30                   1000
    02.02.2005   50                   2000
    03.02.2005   20                   3000
    RESULT       100                  6000
    In the above result the key figure has been aggregated to the total value, which makes no sense for a net stock quantity.
    If stock quantity is taken as a non-cumulative key figure:
    Date         Net Stock Quantity (LAST)   Sales Revenue
    01.02.2005   30                          1000
    02.02.2005   50                          2000
    03.02.2005   20                          3000
    RESULT       20                          6000
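    The difference between the two result rows can be sketched as the same column aggregated two ways (plain Python, purely illustrative; not BW code):

```python
# Sketch: cumulative key figures aggregate with SUM, while a
# non-cumulative key figure such as stock uses an exception
# aggregation like LAST over the time dimension.

rows = [
    ("01.02.2005", 30, 1000),  # (date, net stock qty, sales revenue)
    ("02.02.2005", 50, 2000),
    ("03.02.2005", 20, 3000),
]

stock = [r[1] for r in rows]
revenue = [r[2] for r in rows]

print("SUM(stock)   =", sum(stock))    # 100 -- meaningless for stock
print("LAST(stock)  =", stock[-1])     # 20  -- the sensible stock result
print("SUM(revenue) =", sum(revenue))  # 6000 -- fine for a cumulative KF
```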
    Hope it helps you.
    Thanks & Regards,
    SD

  • How does Non-cumulative KeyFigure Works ?

    Hi, experts !
    What's the difference between non-cumulative values with summation versus last-value aggregation?
    And how does a non-cumulative value with inflow and outflow work?
    Does anyone have relative documents?
    Plz post a copy to my e-mail: [email protected]
    Thank you very much !!!
    Looking forwards !

    Hi, Can you also send me the doc.  My Id is [email protected]
    Thanks,
    Sam

  • How to load init for non cumulative values?

    Hi folks,
    anyone here who can tell me how to load initial amounts from a DSO into non-cumulative key figures in InfoCubes?
    I found only poor documentation about this.
    Thanks!

    hi Timo
    you load the initialization as you do in the normal case. The only thing you have to keep in mind is that before loading the opening balance you must untick NO MARKER UPDATE on the InfoCube, and compress the request as soon as possible, as this greatly affects your query performance.
    sanjeev
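    As a rough illustration of the marker idea (plain Python, not actual BW internals; names and data are invented): once requests are compressed, the marker holds a reference stock level, so a query only applies the deltas around the requested date instead of re-reading the whole movement history:

```python
# Rough sketch of the marker concept: the marker stores a reference
# (compressed) stock level, and the stock at a requested date is the
# marker value plus the deltas posted up to that date.

def stock_at(date, marker_value, deltas_after_marker):
    """deltas_after_marker: list of (iso_date, change) pairs."""
    stock = marker_value
    for d, change in deltas_after_marker:
        if d <= date:  # only movements on or before the requested date
            stock += change
    return stock

# Marker (opening balance after compression) = 500
print(stock_at("2024-03", 500, [("2024-02", -50), ("2024-04", 30)]))  # 450
```

This is why compressing the opening-balance request matters for performance: the fewer uncompressed deltas remain, the less work the query has to do on top of the marker.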

  • Non cumulative with non-cumulative change

    Hi Experts,
    I have a question. As I know, I can create a non-cumulative key figure in two ways:
    1. Non-cumulative with non-cumulative change, and
    2. Non-cumulative with inflow and outflow.
    The second one is very clear, but I am not clear about the first option, i.e. what the non-cumulative value change is and how it calculates the non-cumulative value.
    If you have an idea, please share your knowledge.
    Regards
    Prakash

    The 1st option is for key figures like headcount or similar, which you would like to store as a total figure in a cube instead of a DSO, in a scenario where you are using a MultiProvider on this together with other non-cumulative cubes.
    The SAP help below explains this quite well
    http://help.sap.com/saphelp_nw04/helpdata/en/8f/da1640dc88e769e10000000a155106/frameset.htm
    Best,
    Ralf

  • Purchasing Data Not Tying Up between BW and R/3.

    Hi,
    As a part of our project, we reloaded all the data related to Purchasing through the extractors
    2LIS_02_SCL
    2LIS_02_ITM
    2LIS_02_HDR
    as these were corrupted due to a patch application.
    We took all possible care during the reload of the data from R/3, such as locking users out of transactions, deleting the setup tables, etc., in a step-by-step process.
    However, after the extraction the data does not tie up between the R/3 and BW Purchasing ODS.
    Did we do something wrong, or how can we get the missing records into BW again?
    Appreciate your help.
    Thanks.

    Hi,
    Did you fill the setup table with any selection? Was the number of records picked previously larger than this time? Have you checked RSA3 in the source system? Is it showing the correct number of records? If yes, then check the selection tab of the InfoPackage to see whether you have maintained any selection there.
    If not, then first check the selection in the setup table, and then the function module which extracts the data. You will find the function module in RSA2: just double-click on the extractor and you will get the function module name.
    Also, if you know which records are missing, you can fill the setup table for those values only and then do a full load.
    Anyway, I think you know the steps in LO extraction; still, for your reference:
    1. Transfer the logistics DataSource for transaction data from Business Content
    With the transfer of the DataSource, the associated extraction structure is
    also delivered, but the extraction structure is based on LIS communication
    structures. Furthermore, based on the extraction structure for the DataSource,
    a restructuring table that is used for the initialization and full update to BI
    is generated.
    Naming convention:
    DataSource 2LIS_<Application>_<Event><Suffix>; where <Event> is
    optional
    Examples:
    2LIS_02_HDR: 02 = MM purchasing, HDR (→ HEADER) = document
    header...
    Extraction structure MC<Application><Event/group of
    events>0<Suffix>; where MC is derived from the associated communication
    structures and <Suffix> is optional
    Examples:
    MC02M_0HDR: Extraction structure for the DataSource 2LIS_02_HDR,
    where M_ indicates the group for the events MA (order), MD (delivery
    schedule), ME (contract) and MF (request).
    Restructuring table (= setup table) <Extraction structure>SETUP
    Example:
    Extraction structure: MC11VA0IT ⇒ Restructuring table:
    MC11VA0ITSETUP
    2. Maintain extraction structure (transaction LBWE)
    This means that fields can be added to the extraction structure that is
    delivered with the DataSource without modifying anything. On the one
    hand, fields from the LIS communication structures that are assigned to the
    extraction structure can be used, that means standard fields that SAP has
    not selected, and on the other hand, customer fields that were attached to
    the LIS communication structures with the append technique can be used.
    After the extraction structure is created, it is generated automatically and the
    associated restructuring table is adapted.
    3. Maintain/generate DataSource
    In the DataSource maintenance, you can assign the properties Selection,
    Hide, Inversion (= Cancellation) and Field Only Known in Customer Exit to
    the fields of the extraction structure. After enhancing the extraction structure,
    the DataSource always has to be generated again!
    4. Replicate and activate DataSource in SAP BI (=metadata upload)
    5. Maintain Data Target (Data Store Object, InfoCube)
    6. Maintain Transformations between DataSource and Data Target
    7. Create a Data Transfer Process
    the Data Transfer Process will be used later to update the data from the PSA
    table into the Data Target.
    8. Set extraction structure for updating to active (transaction LBWE)
    In this way, data can be written to the restructuring table or the delta queue
    from then on using the extraction structure (see following steps).
    9. Filling the restructuring table/restructure (OLI*BW)
    During this process, no documents should be created or changed in the
    system! In some applications, it is possible to fill the restructuring table
    beforehand in simulation mode. These results are listed in a log (transaction
    LBWF). Before filling the restructuring table, you must ensure that the
    content of the tables is deleted (transaction LBWG), preventing the table
    from being filled multiple times. Once the restructuring tables are filled,
    document editing can resume as long as Unserialized V3 Update or Queued
    Delta is selected in the next step. Be absolutely sure that no V3 collection
    run is started until the next successful posting of an InfoPackage for delta
    initialization (see step 11).
    10. Select update method
    • Unserialized V3 update
    • Queued delta
    • Direct delta
    11. Create an InfoPackage for the DataSource and schedule the Delta
    Initialization in the Scheduler
    This updates the BI-relevant data from the restructuring table to the
    PSA table. Since the restructuring table is no longer needed after delta
    initialization, the content can be deleted (transaction LBWG).
    Use the Data Transfer Process created in step 7 to update the data from
    the PSA table into the Data Targets. After successful delta initialization,
    document editing can resume, as long as the direct delta update method
    was selected in step 10. This means that BI-relevant delta data is written
    directly to the delta queue.
    Note:
    If the DataSource supports early-delta initialization, the delta data can
    be written to the delta queue during delta initialization. This feature is
    controlled with an indicator in the Scheduler.
    12. Start V3 collection run (transaction LBWE)
    This step is only necessary when the update method unserialized V3 Update
    or Queued Delta was selected in step 10. By starting a corresponding job for
    an application, the BI-relevant delta data is read from the update tables or
    extraction queue and written to the delta queue.
    13. Create an InfoPackage for the DataSource in BI and schedule the Delta
    Update in the Scheduler
    The BI-relevant delta data from the delta queue for the DataSource is updated
    to the PSA table. Use the Data Transfer Process created in step 7 to update
    the data from the PSA table into the Data Targets.
    Hope this helps.......
    Regards,
    Debjani...
