Explain statspack values for tablespace & file IO

Oracle 10.2.0.2, AIX 5.2, 64-bit.
In the Tablespace IO Stats and File IO Stats sections of statspack and AWR reports, can someone help clear up some confusion I have with the values for Av Reads/s and Av Rd(ms)? I'll reference some values from one of my reports over a one-hour snapshot period; in both sections the first three columns are Reads, Av Reads/s, and Av Rd(ms) respectively.
For Tablespace IO I have the following.
PRODGLDTAI
466,879 130 3.9 1.0 8,443 2 0 0.0
For File IO I have the following for each file within this tablespace.
PRODGLDTAI /jdb10/oradata/jde/b7333/prodgldtai04.dbf
113,530 32 2.6 1.0 1,302 0 0 0.0
PRODGLDTAI /jdb14/oradata/jde/b7333/prodgldtai03.dbf
107,878 30 1.6 1.0 1,898 1 0 0.0
PRODGLDTAI /jdb5/oradata/jde/b7333/prodgldtai01.dbf
114,234 32 5.8 1.0 2,834 1 0 0.0
PRODGLDTAI /jdb5/oradata/jde/b7333/prodgldtai02.dbf
131,237 36 5.2 1.0 2,409 1 0 0.0
From this I can calculate that there were on average 129.68 reads every second for the tablespace, and that matches what is listed. But where does the Av Rd(ms) come from? If there are 1,000 milliseconds in a second and there were 130 reads per second, doesn't that work out to about 7.7 ms per read?
What exactly is Av Rd(ms)? Is it how many milliseconds one read takes on average? I've read in the Oracle Performance Tuning Guide that it shouldn't be higher than 20. What exactly does this statistic measure? Also, we are currently looking at purchasing a SAN, and we were told that value shouldn't be above 10; is that just a matter of opinion? Would these values be largely meaningless for tablespaces and datafiles that aren't very active over an hour-long period?
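
As far as I can tell, Av Rd(ms) is not derived from the snapshot length at all: it is the cumulative read time divided by the number of reads, taken from the underlying file statistics (V$FILESTAT.READTIM is recorded in hundredths of a second, so the report effectively computes READTIM * 10 / PHYRDS). So yes, it is the average time one read took. Dividing 1,000 ms by 130 reads/s would only give the per-read time if the reads ran strictly one after another with no idle time and no overlap; in practice reads from different sessions overlap, so the two figures are independent. Using the numbers above: 466,879 reads at an average of 3.9 ms each is roughly 1,800 seconds of cumulative read time inside the one-hour window, which is entirely possible when several sessions read concurrently. And for files that saw only a handful of reads during the snapshot, the average is based on so few samples that it says very little.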

Similar Messages

  • How to set default value for html:file in struts

    Hi,
    I am working on an application using Struts. I am using <html:file property="" /> to let the user upload a file, but I need a default file to be pre-set, so that even if the user does not click Browse, that default file is uploaded when the page is submitted. Is it possible to set a default value for html:file? If yes, please provide any suggestions or links on how to achieve this. Thanks in advance.

    www.google.com?q=STRUTS+DOCUMENTATION
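
    For what it's worth, this is not really a Struts limitation: browsers deliberately ignore any preset value on a file input for security reasons, so a page cannot pre-select a file to upload on the user's behalf. The usual approach is to have the server fall back to a default file when nothing was uploaded.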

  • Can I use MD5 value for indexing files ?

    I would like to know if the MD5 value for each unique file is also unique. I am wondering if you can use the MD5 value of a file for indexing. Any suggestions?

    I would like to know if the MD5 value for each unique file is also unique.
    No, since the number of possible MD5 hashes is smaller than the number of possible files. Of course, if you don't have many files the probability of clashes is pretty low. There's some theory about this, which you'll find in algorithms textbooks where they talk about hash tables.
    I am wondering if you can use the MD5 value of a file for indexing. Any suggestions?
    Why? Don't you want your index to tell you something about the contents of the file?
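
    If you do go ahead and key an index by MD5 (accepting the small collision risk discussed above), a minimal sketch of computing a file's digest with the standard java.security.MessageDigest API could look like this (the class and file names are just placeholders):

        import java.io.FileInputStream;
        import java.security.MessageDigest;

        public class Md5Demo {
            // Returns the MD5 of a file as a 32-character hex string.
            public static String md5Hex(String filename) throws Exception {
                MessageDigest md = MessageDigest.getInstance("MD5");
                FileInputStream in = new FileInputStream(filename);
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) > 0) {
                    md.update(buf, 0, n);
                }
                in.close();
                StringBuilder hex = new StringBuilder();
                for (byte b : md.digest()) {
                    hex.append(String.format("%02x", b));
                }
                return hex.toString();
            }

            public static void main(String[] args) throws Exception {
                System.out.println(md5Hex("test.txt"));   // placeholder file name
            }
        }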

  • Reason to set MaxErrorCount along with Propagate value for Excel Files

    Hi,
    I had an ETL earlier which processed CSV files from a folder. The requirement was to move error files into a Rejected folder but continue processing all the files from the source folder. This was satisfied by setting the system variable "Propagate" to "false" in the event handler section of the Data Flow Task.
    The MaxErrorCount value for the ForEach Loop was left at its default value of "1".
    When tested, it worked perfectly fine.
    Now there is an Excel file as the source, with the same requirement as above. But even with the "Propagate" variable set to "false" as before, execution stopped as soon as an error file was found.
    Later I found the link below:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/0065b5c2-f633-4cbf-81f9-031d1ed25555/how-to-skip-wrong-excel-files-ignore-errors-on-ssis-foreach-task
    As mentioned in the link, I made the following changes:
    1) Set MaxErrorCount for the ForEach Loop to 1000
    2) Set the "Propagate" variable to false
    Now it works perfectly fine.
    Can you please let me know why setting "Propagate" worked for CSV but did not work for Excel? Why was the additional MaxErrorCount setting required for Excel files?
    Detailed information on the above would be very helpful.
    Thanks,
    Raksha

    Hi Raksha,
    In your SSIS package, setting the Propagate system variable to False in the event handler prevents the Data Flow Task (DFT) from propagating its failure to its parent container. The ValidateMetadata property of a Data Flow Task component doesn't only take effect at design time but also at runtime. During the validation phase, a component is supposed to check that the cached metadata in the package is still in sync with the underlying table/view/query. If there is a mismatch, the component returns a special status (VS_NEEDSNEWMETADATA). When this happens at design time, SSIS triggers a metadata refresh by calling ReinitializeMetadata(); at runtime, it results in an error.
    In your scenario, setting the ValidateMetadata property to false avoids this error in the DFT, hence you don't need to set MaxErrorCount to a value greater than 1.
    For more information about how the ValidateMetadata property works, please see Matt's answer in the following thread:
    http://stackoverflow.com/questions/3297097/validateexternalmetadata-property-what-exactly-does-this-do
    Regards,
    Mike Yin
    TechNet Community Support

  • No "Auto Extend" option for tablespace files in sqldeveloper 4.0

    In the Create Tablespace > File Specification tab there is no “Auto Extend” option anymore in SQL Developer 4.0 (it is still present in 3.2). Does anyone know how to enable it, in case I missed something in the configuration?
    Here is an illustration:
    http://i.piccy.info/i9/7c8d7a842786c5e2d07fb07464948813/1387918107/43886/670008/screen_14.png
    Thanks.

    Oh, really? Before raising this question I searched everywhere and did not find any mention of this defect. Thank you for confirming it.
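
    In the meantime, autoextend can still be turned on outside the dialog with plain SQL from a worksheet, for example ALTER DATABASE DATAFILE '<datafile name>' AUTOEXTEND ON NEXT 100M MAXSIZE UNLIMITED (the file name and sizes here are only placeholders), or by including the AUTOEXTEND clause directly in the CREATE TABLESPACE statement.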

  • Incorrect CRC16 value for ACE file

    Hi,
    I am trying to implement a Java app that will read ACE files. These are files that are compressed using an app such as WinAce.
    I have looked at the ACE file structure documentation, which says that the first 2 bytes hold the CRC-16 of the file header. The header is usually 54 bytes long: the first 2 bytes are the CRC-16, the next 2 bytes hold the length of the important ACE file info, and the bytes from the next one up to the end of the header form the section over which the CRC is calculated. So, for example, if the total header is 55 bytes, the CRC is calculated over bytes 4 to 54 (counting the first byte as 0).
    0      2           4             5                    ...                                54
    |CRC16 | Length    |     00      |  IMPORTANT ACE FILE INFO (BYTES 4 to 54 USED IN CRC16) |
    -------------------------------------------------------------------------------------------
    However, the CRC calculated by WinAce is "de 06" in hexadecimal, and when I calculated the checksum I got "a2 94". I don't know why my checksum is incorrect.
    Here is my entire class that calculates the checksum:
    package org.ace.internals;

    // Calculates a 16-bit CRC (CCITT polynomial 0x1021) over a byte array.
    public class CRCDemo {

        /** x16 + x12 + x5 + 1 generator polynomial (0x8408 is used in European X.25). */
        private static final int poly = 0x1021;

        /** Scrambler lookup table for fast computation. */
        private static int[] crcTable = new int[256];

        static {
            // initialise scrambler table
            for (int i = 0; i < 256; i++) {
                int fcs = 0;
                int d = i << 8;
                for (int k = 0; k < 8; k++) {
                    if (((fcs ^ d) & 0x8000) != 0)
                        fcs = (fcs << 1) ^ poly;
                    else
                        fcs = (fcs << 1);
                    d <<= 1;
                    fcs &= 0xffff;
                }
                crcTable[i] = fcs;
            }
        }

        /**
         * Calc CRC with cmp method.
         * @param b byte array to compute CRC on
         * @return 16-bit CRC, signed
         */
        public static short cmpCRC(byte[] b) {
            // loop, calculating CRC for each byte of the array
            int work = 0xffff;
            for (int i = 0; i < b.length; i++) {
                work = (crcTable[(work >> 8) & 0xff] ^ (work << 8) ^ (b[i] & 0xff)) & 0xffff;
            }
            return (short) work;
        }

        public static void main(String[] args) {
            // The relevant ACE header section that is used to calculate the CRC16
            byte[] bytes = new byte[] {
                (byte) 0x00, (byte) 0x00, (byte) 0x90, (byte) 0x2a, (byte) 0x2a, (byte) 0x41,
                (byte) 0x43, (byte) 0x45, (byte) 0x2a, (byte) 0x2a, (byte) 0x14, (byte) 0x14,
                (byte) 0x02, (byte) 0x00, (byte) 0xac, (byte) 0x5a, (byte) 0xe1, (byte) 0x32,
                (byte) 0x2b, (byte) 0x0d, (byte) 0x3e, (byte) 0x23, (byte) 0x00, (byte) 0x00,
                (byte) 0x00, (byte) 0x00, (byte) 0x16, (byte) 0x2a, (byte) 0x55, (byte) 0x4e,
                (byte) 0x52, (byte) 0x45, (byte) 0x47, (byte) 0x49, (byte) 0x53, (byte) 0x54,
                (byte) 0x45, (byte) 0x52, (byte) 0x45, (byte) 0x44, (byte) 0x20, (byte) 0x56,
                (byte) 0x45, (byte) 0x52, (byte) 0x53, (byte) 0x49, (byte) 0x4f, (byte) 0x4e,
                (byte) 0x2a
            };
            short crc16bit = cmpCRC(bytes);
            // mask to 16 bits so a negative short prints as 4 hex digits, not sign-extended
            System.out.println(Integer.toHexString(crc16bit & 0xffff));
        }
    }
    Any hints and code are much appreciated. Thanks.
    Rizwan

    Hi,
    I'm not sure whether WinACE uses the same CRC algorithm and CRC seeds as WinZip,
    but here's a utility class I've been using for a long while which generates exactly the same CRC values as WinZip.
    regards,
    Owen
    import java.io.FileNotFoundException;
    import java.io.FileInputStream;
    import java.io.BufferedInputStream;
    import java.io.InputStream;
    public class CRC32 {

        protected static long[] CRCTable;

        /* Initial polynomial values may be chosen at random; in a 32-bit CRC algorithm
         * anything from 0 to 4294967295 ( ( 2 ^ 32 ) - 1 ) will do.
         * However EDB88320 is the exact value used by PKZIP and ARJ, so we generate
         * identical checksums to them.  A nice touch to make testing easier. */
        protected final static long CRC32_POLYNOMIAL = 0xEDB88320L;
        protected final static long INITIAL_CRC32    = 0xFFFFFFFFL;

        static {
            // build the 256-entry lookup table once
            CRCTable = new long[256];
            long crc;
            for (int i = 0; i <= 255; i++) {
                crc = i;
                for (int j = 8; j > 0; j--) {
                    if ((crc & 1) != 0)
                        crc = (crc >> 1) ^ CRC32_POLYNOMIAL;
                    else
                        crc >>= 1;
                }
                CRCTable[i] = crc;
            }
        }

        public static long getInitialCRC() {
            return INITIAL_CRC32;
        }

        public static long calcFileCRC32(String filename) throws Exception {
            // The original also wrapped the stream in a CRC32InputStream helper class
            // (not shown here) to cross-check the result; the direct calculation below
            // is self-contained.
            FileInputStream inputStream = new FileInputStream(filename);
            long crc32 = calcFileCRC32(inputStream);
            inputStream.close();
            return crc32;
        }

        public static long calcFileCRC32(InputStream inStream) throws Exception {
            long lCrc = INITIAL_CRC32;
            int iCount;
            byte[] buffer = new byte[4096];
            BufferedInputStream myFastReader = new BufferedInputStream(inStream);
            while ((iCount = myFastReader.read(buffer)) > 0) {
                lCrc = calculateBufferCRC(buffer, 0, iCount, lCrc);
            }
            lCrc ^= INITIAL_CRC32;  // final XOR, as used by PKZIP
            return lCrc;
        }

        public static long calcCRC32(String aStr) {
            // note: unlike calcFileCRC32, this does not apply the final XOR
            long lCrc = 0;
            if (aStr != null) {
                byte[] strBytes = aStr.getBytes(); // warning: encoding-scheme dependent
                if (strBytes != null)
                    lCrc = calculateBufferCRC(strBytes, strBytes.length, 0xFFFFFFFFL);
            }
            return lCrc;
        }

        public static long updateCRC(byte b, long lCrc) {
            long temp1 = (lCrc >> 8) & 0x00FFFFFFL;
            long temp2 = CRCTable[((int) lCrc ^ b) & 0xFF];
            return temp1 ^ temp2;
        }

        public static long calculateBufferCRC(byte[] pcBuffer, int iCount, long lCrc) {
            return calculateBufferCRC(pcBuffer, 0, iCount, lCrc);
        }

        public static long calculateBufferCRC(byte[] pcBuffer, int offset, int iCount, long lCrc) {
            if (iCount <= 0)
                return lCrc;
            int pcIndex = offset;
            long temp1, temp2;
            while (iCount-- != 0) {
                temp1 = (lCrc >> 8) & 0x00FFFFFFL;
                temp2 = CRCTable[((int) lCrc ^ pcBuffer[pcIndex++]) & 0xFF];
                lCrc = temp1 ^ temp2;
            }
            return lCrc;
        }

        public static void main(String[] args) {
            long crc;
            if (args.length > 0) {
                for (int i = 0; i < args.length; i++) {
                    try {
                        crc = calcFileCRC32(args[i]);
                        System.out.println(i + " : " + crc + " ( " + Long.toHexString(crc) + " ) : " + args[i]);
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }
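
    Incidentally, the JDK's built-in java.util.zip.CRC32 uses the same 0xEDB88320 polynomial, initial value, and final XOR as the class above, so it makes a handy cross-check. A minimal sketch, using the fully qualified name to avoid a clash with the class above:

        public class Crc32Check {
            public static void main(String[] args) {
                byte[] data = "hello".getBytes();
                java.util.zip.CRC32 crc = new java.util.zip.CRC32();
                crc.update(data, 0, data.length);
                // should print the same value as calcFileCRC32() over the same bytes
                System.out.println(Long.toHexString(crc.getValue()));
            }
        }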

  • White balance values for NEF file

    Hello
    I'm using Ligthroom 1.2 with nikon raw files (NEF) taken with D80 camera (firmware 1.01).
    The white balance is most of the time set to 5300K on my camera. But in LR, it always shows 4950K, with tint set to -1.
    I know there is some compatibility problems between LR and NEF files, but does anyone know if there is a solution (except of course applying systematically the right temperature in LR during import.)?
    Thanks
    Erwann

    I believe the issue is actually that Nikon encrypts the white balance in NEF files from its more recent cameras. If you google for it you'll find plenty of links to all the problems this has caused. Adobe and other raw-converter developers have had to work around the encryption of this value in order to obtain a reasonable white balance without breaking the DMCA (the draconian US anti-piracy law). Apparently, for some cameras the algorithm is not so good. The alternative for Adobe would have been to use Nikon's own raw rendering library, which for obvious reasons will never happen. The only software I am aware of that correctly reads NEF white balance for cameras after the D2X without using Nikon's libraries is dcraw/ufraw, and they can get away with cracking the encryption. This really is one of the dumber things Nikon ever did.

  • [CS3 INX] Values for file type in LnkI attribute?

    I've poked around in the C++ headers and all the relevant documentation I can find but can't find any definition of the possible values for the "file type" field in the Link Info array in CS3 INX files.
    Does anyone know where I can find the list of possible values?
    I'm generating INX files with links and need to be able to generate the appropriate values based on the link targets.
    Thanks,
    Eliot

    Hi
    Pls tell me how did you resolve this issue. I am also facing the same problem.
    Thanks
    Prakash

  • Default setting for deleting files

    How can I change the default value for deleting files from 'Keep file' to 'In waste basket' (sorry, this is a translation of the German 'In den Papierkorb'), meaning the file is actually deleted from the hard disk?

    Thank you for your quick answer. Yes, that's where I am doing it, and yes, I do get a choice between 'Delete from hard disk' and 'Delete from iTunes only'.
    But 'Delete from iTunes only' is the default. I would like to change the default to 'Delete permanently (from hard disk too)'. This would let me work much faster using keys only (Del + Enter), without touching the mouse to switch from 'Delete from iTunes only' to 'Delete permanently'.

  • How to set threshold value for single tablespace in grid control 11g

    Hi,
    I want to set the threshold value for a single tablespace in Grid Control 11g.
    Could you please provide the navigation path?

    Sandy wrote:
    Can you please provide me the full navigation path?
    Go to Targets --> Databases
    Select the database you would like to set this alert for
    Select the link Metrics and Policy Settings
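
    If you prefer to do it outside the console, I believe the same warning/critical thresholds for tablespace usage can also be set directly in the target database with the DBMS_SERVER_ALERT.SET_THRESHOLD procedure (using the TABLESPACE_PCT_FULL metric and the tablespace object type); as far as I know, Grid Control picks up these server-side thresholds for the Tablespace Space Used (%) metric.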

  • Possible to set the Created_by column value for File Browse?

    I'm using database account authentication, and a user then gets to upload a file, which my stored procedure reads. While investigating a problem, I happened to query the apex_workspace_files view and noticed that the created_by column is always set to APEX_PUBLIC_USER for the files my users upload. Is there some way I can set that value when they log in, so I can track who actually did the upload?
    Thanks,
    Stew

    Dimitri,
    I was just using the standard File Browse item, so the blob comes from the workspace view. Though I've seen notes here about loading into your own table, what I had seemed to work (and was already done) fairly well. There were just these little features I wanted to use...
    Thanks for the suggestion.
    Dave, which stored procedure are you suggesting I add the apex_custom_auth.get_username value to? I had hoped that the internal File Browse routine would pick up the get_username value automatically instead of the application definition's Public User value.
    Thanks,
    Stew

  • I am trying to export a combined PDF based on the BOOK option using the script below, but I am getting the error "Invalid value for parameter 'to' of method 'exportFile'. Expected File, but received 1952403524". Can anyone suggest a solution?

    Dear all,
    I am trying to export a combined PDF based on the BOOK option using the script below, but I am getting the following error message: "Invalid value for parameter 'to' of method 'exportFile'. Expected File, but received 1952403524". If anyone knows a solution, please suggest one.
    var myBookFileName, myBookFileName_temp;
    if (myFolder != null) {
        var myFiles = [];
        var myAllFilesList = myFolder.getFiles("*.indd");
        for (var f = 0; f < myAllFilesList.length; f++) {
            var myFile = myAllFilesList[f];
            myFiles.push(myFile);
        }
        if (myFiles.length > 0) {
            myBookFileName = myFolder + "/" + myFolder.name + ".indb";
            myBookFileName_temp = myFolder.name;
            myBookFile = new File(myBookFileName);
            myBook = app.books.add(myBookFile);
            myBook.automaticPagination = false;
            for (i = 0; i < myFiles.length; i++) {
                myBook.bookContents.add(myFiles[i]);
            }
            var pdfFile = File(File(myFolder).fsName + "\\" + myBookFileName_temp + "_WEB.pdf");
            var bookComps = myBook.bookContents;
            if (bookComps.length === 1) {
                bookComps = [bookComps];
            }
            var myPDFExportPreset = app.pdfExportPresets.item("AER6");
            app.activeBook.exportFile(ExportFormat.PDF_TYPE, File("D:\\AER\\WEBPDF.pdf"), false, myPDFExportPreset, bookComps);
            //myBook.exportFile(ExportFormat.pdfType, pdfFile, false);
            //myBook.exportFile(pdfFile, false, pdfPref, bookComps);
            myBook.close(SaveOptions.yes);
        }
    }

    Change the line below:
    app.activeBook.exportFile(ExportFormat.PDF_TYPE, File("D:\\AER\\WEBPDF.pdf"), false, myPDFExportPreset, bookComps);
    to
    app.activeBook.exportFile(ExportFormat.PDF_TYPE, File("D:\\AER\\WEBPDF.pdf"), false, myPDFExportPreset);
    Vandy

  • Default cell values for column not properly saved in uir file in labwindows 2009 (9.1.0 427)?

    I've run into a strange problem with the table control.  Basically, even though I set default cell values for a particular column as numeric, when I try to add items to the list it tries to add them as strings, and returns an error message that it is expecting *char instead of int.  Furthermore, when I open the uir file that contains the table in question in 2010, it appears as if the default cell values for that column are still set as strings, even though in 2009 when I open the uir file it shows as numbers.  I tried converting the uir to C code, and sure enough the C code indicated that the column still is a string type.
    I've gone ahead and made a small project to show the issue. If you open this project in LabWindows 2009, click on the table in table_bug.uir, and edit the default cell values for column 1, you will see that the cell settings have the type as Numeric and the data type as int. When you run the project, however, it fails with an error message saying that it is expecting a *char. When this same project is loaded into LabWindows 2010, clicking on the table in table_bug.uir and editing the default cell values for column 1 shows the type as String. When I change this to Numeric (and change the numeric attribute to int), it runs fine in 2010. I tried simply changing the uir in 2010 and then using it in 2009, but 2009 complains that the uir is from a newer version (understandable). If there is any workaround that would let me continue to use 2009 for the time being, that would be great.
    Any help would be greatly appreciated.
    thanks,
    Alex Corwin
    Attachments: table_bug.zip (324 KB)

    I opened the UIR in 2009 (but I have 2009 SP1) and it still showed that the default value for the first column was a string. I didn't have any problems changing it to a numeric int, and then building and running the project without error.
    Here are a few things you can try:
    1) Change the default value to a string. OK out of the dialog, re-enter the dialog, and change it back to Numeric int. Resave and see if the problem has gone away.
    2) You said you get a ".UIR is from a newer version" error when opening the 2010 UIR in 2009. Does the UIR still open if you click okay? Often times this will work just fine. Assuming you don't have any problems with this, make a minor change to the UIR in 2009, such as moving the table to the left, and then back to the right and then re-save. See if your program works now.
    Kevin B.
    National Instruments

  • Key Field Value for File Adapter Sender Wildcard

    Hello everybody, is there a way to use a wildcard for the NamA.keyFieldValue property in the communication channel for a file sender? Here's an example of why: I need to pick up some files via PI 7.0, and my key field can change from file to file; in one file it can be HF28 and in another HF29. The only character that is going to be constant is the H. I would really appreciate any suggestions; thanks in advance.
    Regards,
    Julio Cesar

    Hi Julio
    The file sender adapter will read the file using file content conversion (FCC).
    The Key Field Value entry is mandatory if the key field name is set; otherwise, the entry can be ignored.
    http://help.sap.com/saphelp_nw70/helpdata/EN/34/393071e9b998438ddb8ce97cd617a1/frameset.htm
    Content Conversion ( The Key Field Problem )
    Thanks
    Gaurav Bhargava

  • Key field values for file content conversion at sender communication channel

    Hi all,
    I am working on a File-to-IDoc scenario. On the sender side we have configured file content conversion for a .CSV file.
    The flat file contains more than one order; XI picks up the file and creates a separate IDoc for each sales order in R/3.
    We have the same field, "Order Item Number", in both the header and the item.
    If the header item contains '00000', a new IDoc should be created on the R/3 side; otherwise a line item is created.
    The order item number will be 00000 for the header, but it differs from item to item.
    source structure
    Order Header
    Order type
    Sold-to-code
    PO number
    Order item number
    Order date
    AdresName1
    AdresName2
    Street and House number
    Postal Code
    City
    Country Code
    Filler
    Order Item
    Order type
    Sold-to-code
    PO  number
    Order item number
    Order date
    Product code
    Order quantity
    Item text
    Filler
    Now I have some questions...
    I don't have standard field values to use as the key fields in the source structure.
    I need to create an IDoc in the R/3 system for each individual record.
    Is it possible to create a new IDoc without using the key field value?
    If it is possible, how can I proceed with FCC?
    please give me your valuable suggestions
    Best Regards,
    satya

    Hi,
    Just check the Content Conversion Parameters - Recordset Structure.
    If you have filled in this parameter with
    Header,1,Item,1
    then change it to
    Header,1,Item,3
    Regards
    Chandra
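
    (If the number of item lines per order is not fixed, the occurrence can also be declared as unbounded, i.e. Header,1,Item,*.)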
