PR release date capturing for all levels

Hi,
I have activated the PR release strategy at header level. There are four release levels. In the custom PR print program, I want to capture the release dates corresponding to all the levels. Can you help me with how to go about it?
Munna.

Hi
The PR release date is not related to the release strategy; it is defined as follows:
Purchase Requisition Release Date
Specifies the date on which the purchase order should be initiated on the basis of the purchase requisition.
The release date is based on:
The purchasing department processing time defined for the plant
The planned delivery time from the material master record or purchasing info record
The delivery date
The goods receipt processing time from the material master record
Note
The planned delivery time from the purchasing info record and the GR processing time are only taken into account if the purchase requisition was generated via materials planning.
Example
Release date -> (purchasing dept processing time) -> PO date -> (planned delivery time) -> Delivery date -> (GR processing time) -> Date required
Date required:                     10.01.96
GR processing time:                 2 days (working days)
Planned delivery time:             10 days (calendar days)
Purchasing dept processing time:    2 days (working days)
For the material to be available on the date it is needed, the purchase requisition must be released on 09.11.96 (requirement date less GR processing time, planned delivery time, and purchasing department processing time).
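As a rough illustration of this back-scheduling (a hedged sketch, not SAP code: plain weekends stand in for the factory calendar, and the dates are placeholders in day.month.year form), in C#:

    using System;

    class PrReleaseDateSketch
    {
        // Subtract working days by skipping Saturdays and Sundays.
        // SAP would use the plant's factory calendar instead.
        static DateTime MinusWorkingDays(DateTime date, int days)
        {
            while (days > 0)
            {
                date = date.AddDays(-1);
                if (date.DayOfWeek != DayOfWeek.Saturday && date.DayOfWeek != DayOfWeek.Sunday)
                    days--;
            }
            return date;
        }

        static void Main()
        {
            DateTime dateRequired = new DateTime(1996, 1, 10);          // date required (placeholder)
            DateTime deliveryDate = MinusWorkingDays(dateRequired, 2);  // less GR processing time (working days)
            DateTime poDate = deliveryDate.AddDays(-10);                // less planned delivery time (calendar days)
            DateTime releaseDate = MinusWorkingDays(poDate, 2);         // less purchasing dept processing time (working days)
            Console.WriteLine("Release date: {0:dd.MM.yy}", releaseDate);
        }
    }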
Hope it helps
Thanks/karthik

Similar Messages

  • Can we load data for all levels in ASO?

    Hi All,
    I'm creating a cube in ASO.
    Can I load data for all levels in ASO? We can load data for all levels in BSO, but for ASO I need confirmation.
    And one more: what is the "consider all levels" option in ASO used for? What is its purpose?
    Can anyone help? It would be appreciated.
    Thanks

    In an ASO cube you can only load to level zero.
    The "consider all levels" option is used for aggregation hints. It allows you to tell the aggregation optimizer to look at all levels when deciding whether aggregation needs to be done on the dimension.

  • Can we load data for all levels in ASO cube

    Hi All,
    Can we load data for all levels of members in an ASO cube in 9.3.1?
    Regards

    Yes, you can load data for all levels in an ASO cube in any version. HOWEVER, none of the upper-level data will be there when you look for it. You will get a warning message during the load because ASO cubes don't store data at upper levels. It is the same as loading data into dynamic calc members in a BSO cube: the load runs without complaints, but there is no data there (at least in ASO you get the warning).

  • Excise is not captured for all or the second line item.

    Dear Experts,
    Excise is not updating for all the line items while making a goods receipt.
    While making the goods receipt, excise is not captured for all the line items,
    but excise is captured at header level, so it is presumed that all the line items
    capture excise. Please suggest what might be the reason.
    Thanks in advance.
    Varun

    Hello,
    I think you are creating the goods receipt with the excise option "Only Part1" in MIGO; then the system will update only the J_1IPART1 table with quantity and without an internal document number. Only after J1IEX will the system update the J_1IEXCDTL table with all values, quantity, and the internal document number.
    Suppose you have done the GR with the option "No excise entry"; then the system will not update any excise-related tables. In that case you should capture the Part1 entry with the help of transaction J1I5; the system will then update the quantity field, but here also it will not update the internal document number.
    Write an update program for the quantity field of the J_1IPART1 table.
    Mahesh Naik

  • Offline data capture for SQLServer2k shows 0 tables

    Hi,
    I'm evaluating OMWB Offline Data Capture for SQL Server 2000 migration.
    I ran the OMWB_OFFLINE_CAPTURE.BAT script for SQL Server 2000. It seems that the DAT files are generated and no error messages appear. The problems occur when I start the OMW and try to "Capture Source Database".
    I specify the directory where the generated DAT files reside; the DAT files appear in the file list and their status is AVAILABLE. But when I run the capture with the Oracle Model creation, I see among the LOG messages that "...Tables Mapped: 0...".
    I created a TEST database with the table [tab] in it. In the generated SS2K_SYSOBJECTS.dat file there is a row for this table:
    tab     \t     2041058307     \t     U      \t     1     \t     1     \t     1610612736     \t     0     \t     0     \t     0     \t     2006/09/26      \t     0     \t     0     \t     0     \t     U      \t     1     \t     67     \t     0     \t     2006/09/26      \t     0     \t     0     \t     0     \t     0     \t     0     \t     0     \t     0     \t     \r\n
    The rest of the objects are not in the Oracle Model either (I believe the user sa must have been created too).
    Please, anybody help with this problem.
    Pavel Leonov, Consultant
    Ispirer Systems Ltd.

    I changed the separators back to the default, but the Oracle Model is still not created. The problem is the same: there are no tables at all from the source database.
    Here is how the row for the table is specified in the SS2K_SYSOBJECTS.dat file:
    tab     ?     2041058307     ?     U      ?     1     ?     1     ?     1610612736     ?     0     ?     0     ?     0     ?     2006/09/26      ?     0     ?     0     ?     0     ?     U      ?     1     ?     67     ?     0     ?     2006/09/26      ?     0     ?     0     ?     0     ?     0     ?     0     ?     0     ?     0     ?     ?
    Here is some information from the log:
    Type: Information | Time: 26-09-2006 15:13:56 | Phase: Capturing | Message: Row delimiter being used for offline capture is ¤
    Type: Information | Time: 26-09-2006 15:13:56 | Phase: Capturing | Message: Column delimiter being used for offline capture is §
    Type: Information | Time: 26-09-2006 15:13:57 | Phase: Capturing | Message: Generating Offline Source Model Load Formatted File For SS2K_SYSOBJECTS.dat, File Size: 5235
    Type: Information | Time: 26-09-2006 15:13:57 | Phase: Capturing | Message: Generated Offline Source Model Load File d:\DBClients\oracle\ora92\Omwb\offline_capture\SQLServer2K\itest\SS2K_SYSOBJECTS.XML
    Type: Information | Time: 26-09-2006 15:14:27 | Phase: Creating | Message: Mapping Tables
    Type: Information | Time: 26-09-2006 15:14:27 | Phase: Creating | Message: Mapped Tables
    Type: Summary | Time: 26-09-2006 15:14:27 | Phase: Creating | Message: Tables Mapped: 0, Tables NOT Mapped: 0
    By the way: after I try to create the Oracle Model, for each of the DAT files an XML file is created with the following content:
    <?xml version="1.0" encoding="Cp1251" ?><START><er></er></START>
    Maybe this will help shed some light on the problem.
    Pavel
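    One thing worth checking, going by the log lines above: the capture reports § as the column delimiter and ¤ as the row delimiter, while the sample row in the .dat file is tab-separated. As a hedged sketch (not an official OMWB utility; the file name and encoding are assumptions), a .dat file could be rewritten to the delimiters the log reports like this:

        using System.IO;
        using System.Text;

        class DatDelimiterFix
        {
            static void Main()
            {
                // Assumes tab-separated columns and CRLF-terminated rows,
                // as in the SS2K_SYSOBJECTS.dat sample above.
                string text = File.ReadAllText("SS2K_SYSOBJECTS.dat", Encoding.Default);
                text = text.Replace("\t", "\u00A7")     // § column delimiter from the log
                           .Replace("\r\n", "\u00A4");  // ¤ row delimiter from the log
                File.WriteAllText("SS2K_SYSOBJECTS.dat", text, Encoding.Default);
            }
        }

    Whether the capture really expects those characters in the .dat files is an assumption based purely on the log; verifying the delimiter settings in the OMWB preferences first would be safer.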

  • How can I monitor my monthly data usage for all 3 computers in my house? I have an Airport base station and it seems there should be software to monitor it from that point rather than monitoring the usage for each computer and then adding it up.


    The following example was one of dozens that showed up on a simple Google search of:
    monitor Internet data use on a Mac
    Watch your Internet usage with NetUse Monitor | Macworld
    Most service providers have an application for their users as well.

  • SQL Server 2012 Change Data Capture for Oracle by Attunity support for Oracle 12

    I would like to know if there are any plans for SQL Server 2012 Change Data Capture for Oracle by Attunity to support versions of Oracle 12 and, if so, by when.

    I have asked the author of
    http://blogs.msdn.com/b/mattm/archive/2012/03/26/cdc-for-oracle-in-sql-server-2012.aspx about this.
    I will either ask him to answer here or I will be the messenger.
    Balmukund Lakhani

  • Generic data source for multiple-level BOM

    Hi Bhanu & everyone,
    What's the best way to create a generic data source for a multiple-level BOM (STKO, STPO & MAST tables) in BW? I also need to maintain the history.
    Thanks,
    Aniruh.

    Hi Anirudh,
    You can create a view of those tables using transaction se11 and then create a datasource using transaction rso2.
    Hope this helps.
    Regards,
    Diego

  • How do I fill in Release Date info for my music tracks?

    I would like to add specific release date information (i.e., the month and day) of CDs and songs to the music I have in my library. Once I updated iTunes to 7.2, I noticed there is a category for Release Date information. But I've looked at all of the info options and I can't seem to find where I would fill in that information. Can someone help me locate it?
    Thanks in advance.

    For clarification: when I look at the Get Info page for each file, I too can only find the "year" option to fill in.
    However, when you look at the columns for your library sorting, there is an option to sort by Release Date.
    This is why I assume there is some place you can fill in the release date; I just can't seem to find it, which is what I'd like help with.

  • "Date Modified" for all files being changed if "Automatically write to XMP" is on

    Recently upgraded to LR3 (v3.4.1) from LR2 on OS X 10.6.8, and I have always had Catalog Settings > Automatically write changes into XMP turned on.
    When browsing existing JPG files in my Library (no Develop changes, no keywording, no Presets, no Import), LR3 is writing to disk; i.e., when I look at files in Finder, almost every viewed file's "Date Modified" is being set to today's date and time. (It actually creates a .swp file, then changes its name back to .jpg.)
    This is really bad, as it makes it impossible for me to use Finder to figure out when I last worked with a file, it triggers needless Time Machine and Backblaze backups, and it unnecessarily churns my disk.
    If I turn off "Automatically write...", this behavior stops. Per David Marx at thelightroomlab.com, I tried turning off this preference, manually doing a "Save Metadata to File" for all files, letting that complete, then turning the preference back on. This does not solve the problem.
    Per a suggestion at photoshop.com, I used ExifTool to see what changes LR was writing to a sample file; from the diff below, you can see that LR is adding a bunch of new fields as well as moving other fields around. But my point is that LR3 should never overwrite a file on disk if all I am doing is browsing through them.
    Is anyone else seeing this? Any ideas would be greatly appreciated!
    -- David
    diff Exif5609_original Exif5609_update
    2c2
    < FileName: DSC_5609_original.JPG
    > FileName: DSC_5609_update.JPG
    5,6c5,6
    < FileModifyDate: 2009:11:27 21:32:54-08:00
    < FilePermissions: rwxr-xr-x
    > FileModifyDate: 2011:08:07 22:06:47-07:00
    > FilePermissions: rw-r--r--
    27a28,29
    > ShutterSpeedValue: 1/200
    > ApertureValue: 7.1
    55d56
    < SerialNumber: 3209521
    75d75
    < Lens: 18-200mm f/3.5-5.6
    185,188d184
    < UserComment:
    < SubSecTime: 00
    < SubSecTimeOriginal: 00
    < SubSecTimeDigitized: 00
    211a208,299
    > XMPToolkit: Adobe XMP Core 5.2-c004 1.136881, 2010/06/10-18:11:35
    > CreatorTool: Ver.1.00
    > MetadataDate: 2011:08:07 22:06:47-07:00
    > SerialNumber: 3209521
    > LensInfo: 18-200mm f/3.5-5.6
    > Lens: 18.0-200.0 mm f/3.5-5.6
    > ImageNumber: 26634
    > RawFileName: DSC_5609.JPG
    > SavedSettingsName: Import
    > SavedSettingsType: Snapshot
    > SavedSettingsParametersVersion: 6.4.1
    > SavedSettingsParametersProcessVersion: 5.0
    > SavedSettingsParametersWhiteBalance: As Shot
    > SavedSettingsParametersIncrementalTemperature: 0
    > SavedSettingsParametersIncrementalTint: 0
    > SavedSettingsParametersExposure: 0.00
    > SavedSettingsParametersShadows: 0
    > SavedSettingsParametersBrightness: 0
    > SavedSettingsParametersContrast: 0
    > SavedSettingsParametersSaturation: 0
    > SavedSettingsParametersSharpness: 0
    > SavedSettingsParametersLuminanceSmoothing: 0
    > SavedSettingsParametersColorNoiseReduction: 0
    > SavedSettingsParametersChromaticAberrationR: 0
    > SavedSettingsParametersChromaticAberrationB: 0
    > SavedSettingsParametersVignetteAmount: 0
    > SavedSettingsParametersShadowTint: 0
    > SavedSettingsParametersRedHue: 0
    > SavedSettingsParametersRedSaturation: 0
    > SavedSettingsParametersGreenHue: 0
    > SavedSettingsParametersGreenSaturation: 0
    > SavedSettingsParametersBlueHue: 0
    > SavedSettingsParametersBlueSaturation: 0
    > SavedSettingsParametersFillLight: 0
    > SavedSettingsParametersVibrance: 0
    > SavedSettingsParametersHighlightRecovery: 0
    > SavedSettingsParametersClarity: 0
    > SavedSettingsParametersDefringe: 0
    > SavedSettingsParametersHueAdjustmentRed: 0
    > SavedSettingsParametersHueAdjustmentOrange: 0
    > SavedSettingsParametersHueAdjustmentYellow: 0
    > SavedSettingsParametersHueAdjustmentGreen: 0
    > SavedSettingsParametersHueAdjustmentAqua: 0
    > SavedSettingsParametersHueAdjustmentBlue: 0
    > SavedSettingsParametersHueAdjustmentPurple: 0
    > SavedSettingsParametersHueAdjustmentMagenta: 0
    > SavedSettingsParametersSaturationAdjustmentRed: 0
    > SavedSettingsParametersSaturationAdjustmentOrange: 0
    > SavedSettingsParametersSaturationAdjustmentYellow: 0
    > SavedSettingsParametersSaturationAdjustmentGreen: 0
    > SavedSettingsParametersSaturationAdjustmentAqua: 0
    > SavedSettingsParametersSaturationAdjustmentBlue: 0
    > SavedSettingsParametersSaturationAdjustmentPurple: 0
    > SavedSettingsParametersSaturationAdjustmentMagenta: 0
    > SavedSettingsParametersLuminanceAdjustmentRed: 0
    > SavedSettingsParametersLuminanceAdjustmentOrange: 0
    > SavedSettingsParametersLuminanceAdjustmentYellow: 0
    > SavedSettingsParametersLuminanceAdjustmentGreen: 0
    > SavedSettingsParametersLuminanceAdjustmentAqua: 0
    > SavedSettingsParametersLuminanceAdjustmentBlue: 0
    > SavedSettingsParametersLuminanceAdjustmentPurple: 0
    > SavedSettingsParametersLuminanceAdjustmentMagenta: 0
    > SavedSettingsParametersSplitToningShadowHue: 0
    > SavedSettingsParametersSplitToningShadowSaturation: 0
    > SavedSettingsParametersSplitToningHighlightHue: 0
    > SavedSettingsParametersSplitToningHighlightSaturation: 0
    > SavedSettingsParametersSplitToningBalance: 0
    > SavedSettingsParametersParametricShadows: 0
    > SavedSettingsParametersParametricDarks: 0
    > SavedSettingsParametersParametricLights: 0
    > SavedSettingsParametersParametricHighlights: 0
    > SavedSettingsParametersParametricShadowSplit: 25
    > SavedSettingsParametersParametricMidtoneSplit: 50
    > SavedSettingsParametersParametricHighlightSplit: 75
    > SavedSettingsParametersSharpenRadius: +1.0
    > SavedSettingsParametersSharpenDetail: 25
    > SavedSettingsParametersSharpenEdgeMasking: 0
    > SavedSettingsParametersPostCropVignetteAmount: 0
    > SavedSettingsParametersGrainAmount: 0
    > SavedSettingsParametersLensProfileEnable: 0
    > SavedSettingsParametersLensManualDistortionAmount: 0
    > SavedSettingsParametersPerspectiveVertical: 0
    > SavedSettingsParametersPerspectiveHorizontal: 0
    > SavedSettingsParametersPerspectiveRotate: 0.0
    > SavedSettingsParametersPerspectiveScale: 100
    > SavedSettingsParametersConvertToGrayscale: False
    > SavedSettingsParametersToneCurveName: Linear
    > SavedSettingsParametersCameraProfile: Embedded
    > SavedSettingsParametersCameraProfileDigest: D6AF5AEA62557FCE88BC099788BBD3CC
    > SavedSettingsParametersLensProfileSetup: LensDefaults
    > SavedSettingsParametersToneCurve: 0, 0, 255, 255
    > IPTCDigest: d41d8cd98f00b204e9800998ecf8427e
    228,230d315
    < SubSecCreateDate: 2009:11:27 21:32:54.00
    < SubSecDateTimeOriginal: 2009:11:27 21:32:54.00
    < SubSecModifyDate: 2009:11:27 21:32:54.00
    http://feedback.photoshop.com/photoshop_family/topics/lr3_date_modified_for_all_files_being_updated_when_browsing_photos_if_catalog_settings_automatically_write_changes_into_xmp_is/replies/6313647

    clvrmnky wrote:
    davidpope007 wrote:
    Then when LR3 loaded my old LR2 images into memory, it "dirtied" the in-memory copy of the file by adding in these new LR3 XMP fields. Then, because I had "Automatically write XMP" on, it said "I better write these changes to disk".
    Yuck. As a former software engineer, this is very bad software engineering.
    It should wait until the user dirties the file (via Develop, keywords, etc.) before presuming to add a bunch of metadata fields that are unique to the new version of LR3.
    Well, I'm a current software developer, and this is, really, a perfectly reasonable thing to do. It is a reasonable trade-off for a convenient feature required by a small subset of users.
    Yes, in most cases the in-memory copy should "never" be dirtied unless the user makes a gesture of some sort, but like I said earlier, this option (once set by the user) sets up the situation where this gesture becomes implicit. This is a clear trade-off for the sake of convenience. And if the XMP is out of date and needs to be updated en masse, so be it.
    The fact is, there is no easy way around this. Do we save up /every/ dirty buffer somehow until you make a gesture that /might/ require the XMP to be up-to-date before acting on that gesture? Now we have to worry about unflushed buffers if something goes wrong and the app exits. Do we save the buffers to the DB? Now we have to block some calls to make another blocking call to flush some or all of those to the DB, and then write some or all of it out to one or more files. In what order? And what if there is a gesture that requires X files to have up-to-date XMP while some or all of them are sitting in unflushed buffers or unflushed DB writes, or we have to wait for the DB?
    As you can see, this is a transactionality nightmare, and the easiest and safest thing to get what the user wants (i.e., up-to-date XMP for the purpose of talking to a third-party XMP aware app) is to simply update the sidecar or XMP block in an atomic manner using the correct file IO. The file will have to change at some point, so it may as well be now.
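    As a generic illustration of the atomic write-temp-then-swap pattern just described (a hedged sketch in C#, not Adobe's code; file names are placeholders):

        using System.IO;

        class AtomicWriteSketch
        {
            // Write the updated contents next to the original, then swap names,
            // so the destination never exists in a half-written state.
            static void ReplaceAtomically(string path, byte[] newContents)
            {
                string temp = path + ".tmp";
                File.WriteAllBytes(temp, newContents);
                File.Replace(temp, path, path + ".bak"); // swaps files, keeps a backup of the original
            }

            static void Main()
            {
                // Illustrative only: replace photo.jpg with an updated copy.
                ReplaceAtomically("photo.jpg", File.ReadAllBytes("photo_updated.jpg"));
            }
        }

    The .swp-and-rename behavior described earlier in the thread is the same idea: the original file is never left half-written.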
    [Thanks to both of you for your detailed replies. I am aware of the need for tradeoffs, so when you say the approach taken is quite reasonable, I do believe you. I also apologize in advance for the length of the following, and I am extremely aware of the time it must have taken you to compose the above replies, but I'm going to add a bit more, if only for my own peace of mind and in hopes of coming up with a solution for my workflow.]
    From my naive point of view, I was expecting the answer to be simply "don't raise the XMPDirtyFlag upon reading in a file". Obviously if your architecture requires you to "upgrade to latest XMP format" upon read, and another part of the system auto-detects "out of date XMP", then it's going to write those changes to disk.
    But it didn't need to be designed that way. LR obviously has mechanisms to know when a user has made a change to XMP so it is able to write XMP changes to disk only when necessary.
    The promise (to me) of "Automatically Write XMP changes to disk" was to auto-save my changes, and not those made for any internal (i.e., XMP versioning related) changes.
    Perhaps the premise is that it is LR3's job to update an individual file's XMP to the latest version so that other XMP-aware apps can make use of it? I would argue that those third-party XMP-aware apps already have to know how to deal with all prior versions of XMP, so LR3 should just leave well enough alone.
    You asked if my problem with your approach was that it was "inelegant"; not at all, it is based on my own perception of what I need from my workflow, so let me describe that so maybe we can find a better way:
    * Part of the appeal of LR to me is that it preserves my original file as it came off the memory card, allowing me to move to a different workflow/toolset in 2025 if I choose to do so
    * However, with all of changes contained in a single database file, I'm concerned about rare (but possible) corruption, so to mitigate this risk, I let LR backup my database weekly and it's also backed up continuously in the cloud via Backblaze
    * Even with backups of the database, there is still a chance that I could lose changes made to individual files (e.g., LR corrupts the DB and I have to go back to last week's DB)
    * Thus the appeal of the "auto-write to XMP" flag -- that way critical changes (develop, crop, keywords) are saved on a per-file basis; I liked the "automatic" part of this (as opposed to a manual save) because then I don't have to teach others in my family how to manually save XMP changes
    * A nice side-effect of this setting is now when I use Finder to find a file and double-click on it to edit it in Photoshop, all my develop changes are right there; (in other words, I like the flexibility of not having to fire up LR in order to just invoke PS from within it); also when I use Bridge I see all the keywords there
    * So with LR2, I had gotten used to what I thought was the best of all worlds -- autosave of changes at the file level via XMP + raw negatives untouched (i.e., Date Modified == the date I took the picture); this allows me to use operating-system-level tools -- Finder -- to locate/search for files
    * Now I upgrade to LR3 and I'm finally understanding that a concept of "XMP versioning" is going to result in changes to many, but not all, of my files. (That's something else that's annoying about this issue: I open up the Grid and browse a folder of files, and only seemingly random ones I've cursored over seem to get written to disk. If it's so urgent that LR3 update the XMP, then it should do it for all the files in the catalog, or at least in that directory.)
    Here's a screenshot from Finder of what I see every time I look at this folder:
    * So now I have to assume that each new version of XMP and/or LR is going to touch my files on disk. Sigh.
    * What I don't like about this is that it is ruining the promise of "untouched raw negative". Yes, the image data is untouched -- which I agree is most of the benefit; but the file has been touched.
    * Perhaps you might empathize a bit more if you imagined that someone went through all your source code or Word files and randomly changed the date to "today" because you upgraded compilers or moved to Word 2011.
    I agree all of this would be solved by having an XMP sidecar file for JPGs, but you indicate that's not going to happen.
    You've also alluded to the solution of resetting the Date Modified to its original value (which I believe is what Finder does when you move or copy a file) but that it is fraught with issues as well. I believe you when you say there are issues, but again the naive part of me wonders why that solution would be so bad...
    I just thought of another potential solution -- turning on Date Created in Finder -- but it turns out that's changed, too.
    I am really at a loss as to what to do and would welcome your suggestions.
    Thanks again and kind regards,
    -- David

  • iTunes "Released" dates on podcasts all the same and wrong.

    I've noticed this going on for some time now but am just getting around to figuring it out. 
    In my iTunes the released dates for my podcasts always come up as the same incorrect date. 
    For some reason they all show a release date of January 5, 2008.  Not sure if the date is significant or not. 
    This is a brand new computer running 10.7.2.  When I set it up though I did use the migration assistant and it brought this issue from my old computer to the new one. 
    I've done a lot of googling on the subject and can't find any online reference to a similar problem. 
    Hoping there is someone on the forums that can figure this out. 
    Thanks!

    First I'd set up a System Restore point. Then try applying the LNK registry fix from the following document:
    File Association Fixes for Windows 7
    Does that get your icons back to normal?

  • Data reconciliation for all GL accounts through FDFD

    Dear All,
    I have executed the data reconciliation for GL accounts through FDFD. After executing this job, I am not able to view any data in the cash management report through FF7A; it shows "no data exists".
    Please let me know how to rectify this to view the cash position as per the GL balances.
    Regards,
    Rama Mohan

    Hello Rama,
    It is a little late, but maybe this answer will be helpful for someone else...
    Something like this happened to me when I executed "Data Reconciliation". What was happening is that the job didn't finish, so it could erase data but was not able to get the data back. What I did was consult the log of the job through System --> Services --> Jobs --> Job Overview, or transaction SM37. I searched for my jobs and saw that they were all cancelled, and the log said "The name of the printer ' ' doesn't exist". So I added a new printer in System --> User Profile --> Own Data --> Defaults --> OutputDevice, a local printer like "LP01". After this, I executed Data Reconciliation again, and the job was successful and got the data back.
    I hope this can help.
    Regards
    Erika Zagal de la Luz

  • MPS run: planned orders were created for all levels?

    Hello PP members
    I ran a small scenario
    Material   MRP Type   Low Level   SG   M/T Type
    A          M0         000         40   FERT
    B          PD         001         40   HALB
    C          PD         002         10   ROH
    Maintained PIRs (MD61) for material A, and ran MD41 (single-item, multi-level planning).
    As material A is an MPS item, I was expecting it to create planned orders only for material A, but this MPS run created planned orders for materials B & C as well. (Looked into MD04, where I see planned orders were created for all the levels.)
    As per the MPS run, it should plan only one level of the BOM (in this scenario, material A).
    Any suggestions why planned orders were created for materials B & C?
    Please clarify.

    Just check:
    The system must have created dependent-requirement planned orders for B and C, not planned orders for requirements you put in Demand Management.
    I.e., if you enter a demand in MD61 for B and C and run the transaction, the system will not consider this requirement during the MPS run. The MPS run will consider the requirement for a child only if it comes from a parent that is an MPS item.
    Hence you need to run MRP for B and C if their own requirements (not the dependent requirements from A) are to be considered.
    I hope you are clear.
    Edited by: Rajesha Vittal on Jan 28, 2008 8:06 AM

  • Generic data access for all classes

    Hello
    I am doing an experiment on data access. In a traditional system we have to write separate insert, update, and delete code in the data access layer for each table.
    My City Table Class:
        public class TbCitiesModel
        {
            string _result;
            int _cityID;
            int _countryID;
            string _name;
            int _sortOrder;
            bool _enable;
            DateTime _createDate;
            string _countryName;

            public string result { get { return _result; } set { _result = value; } }
            public int cityID { get { return _cityID; } set { _cityID = value; } }
            public int countryID { get { return _countryID; } set { _countryID = value; } }
            public string name { get { return _name; } set { _name = value; } }
            public int sortOrder { get { return _sortOrder; } set { _sortOrder = value; } }
            public bool enable { get { return _enable; } set { _enable = value; } }
            public DateTime createDate { get { return _createDate; } set { _createDate = value; } }
            public string countryName { get { return _countryName; } set { _countryName = value; } }
        }
    Traditional Code:
        public List<TbCitiesModel> DisplayCities()
        {
            List<TbCitiesModel> lstCities = new List<TbCitiesModel>();
            using (SqlConnection connection = GetDatabaseConnection())
            using (SqlCommand command = new SqlCommand("STCitiesAll", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        lstCities.Add(new TbCitiesModel());
                        lstCities[lstCities.Count - 1].cityID = Convert.ToInt32(reader["cityID"]);
                        lstCities[lstCities.Count - 1].countryID = Convert.ToInt32(reader["countryID"]);
                        lstCities[lstCities.Count - 1].name = Convert.ToString(reader["name"]);
                        lstCities[lstCities.Count - 1].sortOrder = Convert.ToInt32(reader["sortOrder"]);
                        lstCities[lstCities.Count - 1].enable = Convert.ToBoolean(reader["enable"]);
                        lstCities[lstCities.Count - 1].createDate = Convert.ToDateTime(reader["createDate"]);
                    }
                }
            }
            return lstCities;
        }
    The above code is used to fetch all cities in the table. But when there is another table, e.g. "TBCountries", I have to write another method to get all countries. So each time it is almost the same code; only the table and parameters change.
    So I decided to work on one global method to fetch data from the database.
    Generic Code:
        public List<T> DisplayCitiesT<T>(T TB, string spName)
        {
            var categoryList = new List<T>();
            using (SqlConnection connection = GetDatabaseConnection())
            using (SqlCommand command = new SqlCommand(spName, connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                foreach (var prop in TB.GetType().GetProperties())
                {
                    string Key = prop.Name;
                    string Value = Convert.ToString(prop.GetValue(TB, null));
                    if (!string.IsNullOrEmpty(Value) && Value.Contains(DateTime.MinValue.ToShortDateString()) != true)
                        command.Parameters.AddWithValue("@" + Key, prop.GetValue(TB, null));
                }
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        int i = 0;
                        TB = Activator.CreateInstance<T>();
                        int colCount = reader.FieldCount;
                        foreach (var prop in TB.GetType().GetProperties())
                        {
                            if (prop.Name != "result" && i <= (colCount - 1))
                                prop.SetValue(TB, reader[prop.Name], null);
                            i++;
                        }
                        categoryList.Add(TB);
                    }
                }
            }
            return categoryList.ToList();
        }
    Calling method:
        TbCitiesModel c = new TbCitiesModel();
        Program p = new Program();
        List<TbCitiesModel> lstCities = p.DisplayCitiesT<TbCitiesModel>(c, "STCitiesAll");
        foreach (TbCitiesModel item in lstCities)
            Console.WriteLine("ID: {0}, Name: {1}", item.cityID, item.name);
    Now it's working fine, but I have tested with 10,00,000 (one million) records in the TBCities table; following are the results:
    1. The traditional method took almost 58, 59, 58, 59, 59 seconds over 5 runs.
    2. The generic method took 1.4, 1.3, 1.5, 1.4, 1.4 [minutes.seconds].
    So by these test results the generic method is slower in performance (because it has 3 foreach loops), but the data set is very big, almost 10,00,000 (ten lakh) records, so it might perform well with smaller record counts.
    1. So my question is: can I use this method for real-world applications? Or is there any performance optimization for this method?
    2. Also, can we use this in ASP.NET C# projects?
    Owner | Software Developer at NULLPLEX SOFTWARE SOLUTIONS http://nullplex.com

    Hi Mayur Lohite,
    Q1: It is not really a fair comparison between the generic code and the traditional code, because the main issue is not generics.
    After taking a look at your generic code, it is the reflection that slows performance:
    TB = Activator.CreateInstance<T>();
    As reflection is a truly late-bound approach to working with your types, the more types you have in a single assembly, the slower you go. Few people try to build everything on reflection; using reflection unnecessarily will make your application very costly.
    Here is a good article about this issue; please take a look:
    Reflection is Slow or Fast? A Practical Demo
    Q2: Or is there any performance optimization for this method?
    These articles present some .NET techniques for using reflection optimally and efficiently:
    Optimizing object creation with reflection
    Optimizing reflection with the Emit method
    Best regards,
    Kristin
    Hello Kristin,
    Can you please tell me how to optimize reflection in my code?
    (the same DisplayCitiesT<T> method as shown above)
    Thank you.
    Owner | Software Developer at NULLPLEX SOFTWARE SOLUTIONS http://nullplex.com
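    One way to apply those articles' advice to the method above is to pay the reflection cost once per type instead of once per row: cache the PropertyInfo array and replace Activator.CreateInstance<T>() with a compiled delegate. A minimal sketch along those lines (Materializer is a made-up helper name; it assumes T has a parameterless constructor and that every mapped property has a matching column, as in the thread's stored procedures):

        using System;
        using System.Linq.Expressions;
        using System.Reflection;

        public static class Materializer<T> where T : new()
        {
            // Compiled once per T: avoids Activator.CreateInstance<T>() on every row.
            private static readonly Func<T> Create =
                Expression.Lambda<Func<T>>(Expression.New(typeof(T))).Compile();

            // Reflected once per T: avoids GetType().GetProperties() on every row.
            private static readonly PropertyInfo[] Props = typeof(T).GetProperties();

            public static T FromRecord(Func<string, object> getValue)
            {
                T item = Create();
                foreach (PropertyInfo prop in Props)
                {
                    if (prop.Name == "result") continue;  // mirrors the guard in the thread's code
                    object value = getValue(prop.Name);
                    if (value != null && value != DBNull.Value)
                        prop.SetValue(item, value, null);
                }
                return item;
            }
        }

    Inside the reader loop, categoryList.Add(Materializer<T>.FromRecord(name => reader[name])); then replaces the Activator call and the per-row GetProperties() loop (this requires adding a where T : new() constraint to DisplayCitiesT<T>). For still more speed, the per-property SetValue calls can also be compiled into delegates, as the Emit article suggests.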

  • Nokia can't release fw 40 for all devices! Nokia bra...

    Hi, I have a Nokia 5800, but it is a TIM-branded device and my latest firmware version is 30.0.011. This version is full of bugs, and the telco TIM (Telecom Italia Mobile) doesn't release the new version 31 or 40, so I can't install the free Ovi Maps because it requires firmware version 31!
    Did I buy a Nokia telephone or a TIM telephone?
    Can't Nokia give the same support to all Nokia telephones?
    I'm very angry; this Nokia policy isn't good!
    Message Edited by mario5588 on 03-Feb-2010 11:06 AM

    I bought it from O2 but without a contract, thus I paid the same price as I would have if I had bought directly from Nokia. Additionally, it's Nokia who has to fix that, not the telco!
    The guarantee is provided by the manufacturer ONLY. The warranty is provided by the dealer ONLY. If your dealer refuses to provide warranty service (e.g. by not publishing updated firmware), you have the right to claim the guarantee.
    My provider here is O2, and because of this topic I called them and asked why their latest firmware is still 11.0.021 while the current official one is 21.0.045. They told me that the O2 services would not run with the official firmware (ROFL), thus they release updates after 1 year!!! That means O2 N97 users are getting 21.0.045 on 02.02.2011.
    BUT
    After I told them that all O2 services run fine with the official firmware and that they should release updates for Nokia phones faster (because they contain the most bugs compared to other manufacturers), they told me to write a letter (or fax) explaining the facts again so they can "debrand" my phone officially.
    So, I would write a fax to TIM and tell them to release the latest update. If they don't agree, I would tell them to debrand your phone. If they still disagree, I would pursue a legal case against them.
    Nokia Bug Tracking System @ http://dev.an3k.de/bugtracker/
