Structure Limitation 2004s

Hi,
I know that a query can only have two structures.  However, I am working with the Analyzer and was wondering if there is some way to have all of the structures that exist in a cube show up under the free characteristics area, or somewhere else in that navigation area.
An example of what I am trying to accomplish: the cube has a structure for the Income Statement and one for the Balance Sheet, and we would like to be able to switch the structure in the rows from one to the other. There is also another structure going across the top, which is a key figure structure.
Has anyone had this requirement and been able to create a report in a BI tool that allows more than 2 structures, and possibly the ability to switch structures in a BI tool (e.g. the Analyzer)?
Thanks
Kristen

Hi Kristen,
Unfortunately, you can have only 2 structures in a query at a time, and you cannot offer flexibility similar to free characteristics, i.e. keep additional structures available as options that the user can switch between while analyzing the data.

Similar Messages

  • CASE structure limits - dynamic?

    Hello everybody
    Is it possible, and if so, how, to make the CASE structure
    condition depend on a variable in my program?
    Example: I have an integer whose value changes while the
    program is running, and for certain reasons I need to make the CASE
    structure sensitive to the values of this integer. How can I do
    this?
    I know there is always a way around it, but I am interested in this
    solution.
    thanks in advance
    regards
    PP

    Hi Pawel,
    I am starting a new thread because I think I know what you are asking for.
    First of all, you cannot change the values of the case structure itself BUT,
    there is a way.
    What I suggest is the following:
    1) Code your second case as normal. Pretend it has 3 cases: 0, 1 & 2.
    2) Code the first case (just an example) such that it has two cases, A & B.
    3) Have the first case's state A return an array of (3) strings: "Good", "Bad", and "Ugly".
    4) Have the first case's state B return an array of (3) strings: "Bad", "Ugly", and "Good".
    Feed the above-mentioned arrays into a "search 1-D array" function. Use the returned index to select the proper case of "the second case".
    If you now pass "A" to the first case structure, and "Good" to the "search 1-D array", it will return an index of 0 and the second case will execute case "0".
    If you pass the first case "B" and again pass "Good" to the "search 1-D array", it will return an index of 2 and the second case will execute state "2".
    So,
    by manipulating the arrays returned by the first case structure, you can "dynamically" influence which of the states of the second case is called.
    There are many variations on this idea that are possible. A similar approach would make it possible to select states based on wildcards and simulate "sparse enums".
    Did I nail it this time?
    Ben
    Ben Rayner

  • Importing internal table from one program to another program

    Hi everybody,
    I have one small doubt.
    I am using a SUBMIT statement and passing values from this program to another program's selection screen; the logic is written in that program, and in that program one internal table's values are exported to a memory ID. Now I have to import those internal table values into my program using an IMPORT statement. I am using the following syntax:
    import itab from menory id 'program name'.
    but I am getting an error saying the program name is unknown.
    What is the exact syntax for this?
    thanking you,
    giri.

    Hi,
    check these statements. A short sketch of your SUBMIT / memory ID scenario comes first, followed by the full IMPORT reference.
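    A minimal sketch of the pattern, assuming hypothetical program names ZCALLER and ZSUBMITTED and a hypothetical memory ID 'Z_MY_ITAB'. The memory ID is a free-form key that you choose; it is not the program name, but it must be identical in the EXPORT and the IMPORT, and both programs must run in the same internal session (SUBMIT ... AND RETURN):
    * In the submitted program ZSUBMITTED (hypothetical name), after ITAB has been filled:
    DATA ITAB TYPE STANDARD TABLE OF MARA.   " same line type in both programs; MARA is only an example
    EXPORT ITAB = ITAB TO MEMORY ID 'Z_MY_ITAB'.
    * In the calling program ZCALLER:
    DATA ITAB TYPE STANDARD TABLE OF MARA.
    SUBMIT ZSUBMITTED AND RETURN.            " pass your selection-screen values as you do today
    IMPORT ITAB = ITAB FROM MEMORY ID 'Z_MY_ITAB'.
    IF SY-SUBRC <> 0.
      WRITE: / 'Nothing was exported under this memory ID.'.
    ENDIF.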
    IMPORT - Get data
    Variants:
    1. IMPORT obj1 ... objn FROM DATA BUFFER f.
    2. IMPORT obj1 ... objn FROM INTERNAL TABLE itab.
    3. IMPORT obj1 ... objn FROM MEMORY.
    4. IMPORT obj1 ... objn FROM SHARED MEMORY itab(ar) ID key.
    5. IMPORT obj1 ... objn FROM SHARED BUFFER itab(ar) ID key.
    6. IMPORT obj1 ... objn FROM DATABASE dbtab(ar) ID key.
    7. IMPORT obj1 ... objn FROM DATASET dsn(ar) ID key.
    8. IMPORT obj1 ... objn FROM LOGFILE ID key.
    9. IMPORT DIRECTORY INTO itab FROM DATABASE dbtab(ar) ID key.
    In addition, the object list can be specified dynamically: IMPORT (itab) FROM ... .
    In some cases, the syntax rules that apply to Unicode programs are different than those for non-Unicode programs. For more details, see Storing Cluster Tables.
    Variant 1
    IMPORT obj1 ... objn FROM DATA BUFFER f.
    Extras:
    1. ... = f (for each object to be imported)
    2. ... TO f (for each object to be imported)
    3. ... ACCEPTING PADDING
    4. ... ACCEPTING TRUNCATION
    5. ... IGNORING STRUCTURE BOUNDARIES
    6. ... IGNORING CONVERSION ERRORS
    7. ... REPLACEMENT CHARACTER c
    8. ... IN CHAR-TO-HEX MODE
    9. ... CODE PAGE INTO f1
    10. ... ENDIAN INTO f2
    The syntax check performed in an ABAP Objects context is stricter than in other ABAP areas.
    See You Cannot Use Implicit Field Names in Clusters.
    Effect
    Imports the data objects obj1 ... objn from the data buffer declared. The data buffer must be of type XSTRING . The data objects obj1 ... objn can be fields, structures, complex structures, or tables. The system imports all the data that has been stored in the data buffer f using the EXPORT ... TO DATA BUFFER statement and is listed here. It also checks that the structure used in the IMPORT statement matches the one in the EXPORT statement.
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The existing data objects in the data cluster specified were imported. The rest remain unchanged. (In some circumstances, this may mean that no data objects were imported).
    SY-SUBRC = 4:
    The data objects could not be imported. The contents of all the objects remain unchanged.
    Addition 1
    ... = f (for each object to be imported)
    Addition 2
    ... TO f (for each object to be imported)
    Effect
    The object is stored in the field f.
    Addition 3
    ... ACCEPTING PADDING
    Effect
    This addition allows you to: append new fields to the end of structures, sub-structures, and internal tables (the IMPORT statement fills the additional fields with initial values); make existing fields (C, N, X, P, I1, and I2) longer; map character-type fields to STRING-type fields; or map byte-type fields to XSTRING-type fields.
    Addition 4
    ... ACCEPTING TRUNCATION
    Effect
    This addition allows you to shorten the last CHAR field, or to omit the last component at the top level. (Until Release 4.6, you could do this without using an addition.)
    Addition 5
    ... IGNORING STRUCTURE BOUNDARIES
    Effect
    This addition means that only the fragment sequence is relevant - that is, that any sub-structures match. If you use this addition, the system ignores any alignment changes necessitated by Unicode - such as inserting named includes.
    You cannot use this addition with either addition 3 (enlarge structure) or addition 4 (shorten structure), since it specifies that structure and include boundaries are to be ignored.
    From Release 6.10 onwards, the include information is stored in datasets, so that the system can also check that includes match - that is, that sub-structures and includes (named or unnamed) are treated equally. When data is imported in a Release prior to 6.10, includes are not checked.
    Addition 6
    ...IGNORING CONVERSION ERRORS
    Effect
    This addition prevents the system from triggering a runtime error if an error occurs when the character set is converted. '#' is used as a replacement character.
    Addition 7
    ... REPLACEMENT CHARACTER c
    Effect
    The replacement character is used if a particular character cannot be converted when the character set is converted.
    This addition can only be used in conjunction with addition 6.
    Addition 8
    ... IN CHAR-TO-HEX MODE
    Effect
    Character-type fields are not converted. To convert a field, you must create a field (or structure) that is identical to the exported field or structure, except that all its character-type components must be replaced with hexadecimal fields.
    You can only use this addition in Unicode programs, to allow you to import camouflaged binary data as single-byte characters.
    Moreover, you cannot use this addition in conjunction with the additions 3, 4, 5, 6, or 7.
    Addition 9
    ... CODE PAGE INTO f1
    Effect
    The code page of the exported data is stored in the character-type field f1 - for example, to analyze data that has been imported with the IN CHAR-TO-HEX MODE addition.
    Addition 10
    ... ENDIAN INTO f2
    Effect
    The byte order (LITTLE or BIG) of the exported data is stored in the field f2 - for example, to analyze data that has been imported with the IN CHAR-TO-HEX MODE addition. The field f2 must have the type ABAP_ENDIAN, which is defined in the type group ABAP. For this reason, the type group ABAP must be included in the ABAP program using a TYPE-POOLS statement.
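    Example (sketch)
    A minimal sketch of this variant, assuming made-up field names F1 and F2: serialize two fields into an XSTRING buffer and read them back.
    DATA: BUF        TYPE XSTRING,
          F1(4)      TYPE C VALUE 'ABCD',
          F2         TYPE I VALUE 42,
          F1_BACK(4) TYPE C,
          F2_BACK    TYPE I.
    * Serialize the listed objects into the XSTRING data buffer.
    EXPORT F1 = F1 F2 = F2 TO DATA BUFFER BUF.
    * Read them back; SY-SUBRC = 0 if the listed objects were found in the cluster.
    IMPORT F1 = F1_BACK F2 = F2_BACK FROM DATA BUFFER BUF.
    IF SY-SUBRC = 0.
      WRITE: / F1_BACK, F2_BACK.
    ENDIF.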
    Variant 2
    IMPORT obj1 ... objn FROM INTERNAL TABLE itab.
    Extras:
    1. ... = f (for each object to be imported)
    2. ... TO f (for each object to be imported)
    3. ... ACCEPTING PADDING
    4. ... ACCEPTING TRUNCATION
    5. ... IGNORING STRUCTURE BOUNDARIES
    6. ... IGNORING CONVERSION ERRORS
    7. ... REPLACEMENT CHARACTER c
    8. ... IN CHAR-TO-HEX MODE
    9. ... CODE PAGE INTO f1
    10. ... ENDIAN INTO f2
    The syntax check performed in an ABAP Objects context is stricter than in other ABAP areas. See No implicit field names in cluster.
    Effect
    Imports the data objects obj1 ... objn (fields, structures, complex structures, or tables) from the specified internal table itab. The first column in the internal table must be of the predefined type INT2 and the second must be type X. To define the first column you must refer to a data element in the ABAP Dictionary that has the predefined type INT2.
    All data that was stored in the internal table itab using EXPORT ... TO INTERNAL TABLE and listed, is imported. The system checks that the EXPORT and IMPORT structures match.
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The existing data objects in the specified data cluster were imported, the rest remain unchanged (it is possible that no data object was imported).
    SY-SUBRC = 4:
    The data objects could not be imported.
    The contents of all listed objects remain unchanged
    Addition 1
    ... = f (for each object to be imported)
    Addition 2
    ... TO f (for each object to be imported)
    Effect
    Places the object in the field f.
    Addition 3
    ... ACCEPTING PADDING
    Effect
    This addition allows you to add new fields to the ends of structures, even to substructures and internal tables (the additional fields are filled with initial values during the IMPORT). It also allows you to increase the size of existing fields (C, N, X, P, I1, and I2) and to map Char fields to STRING type fields or byte fields to XSTRING type fields.
    Addition 4
    ... ACCEPTING TRUNCATION
    Effect
    This addition allows you to shorten the last CHAR field or omit the last component on the highest level (until Release 4.6 this was possible without specifying an addition).
    Addition 5
    ... IGNORING STRUCTURE BOUNDARIES
    Effect
    This addition means that only the fragment sequence is relevant, that is, any substructures match. With this addition, the system also ignores alignment changes arising from the Unicode conversion (for example, due to subsequent insertion of named includes).
    This addition rules out any subsequent structural enhancements (addition 3) or structural shortening (addition 4) because with this addition it is the structural limits and include limits that are to be ignored.
    As from Release 6.10, the include information will also be stored in the dataset, so that it is possible to also check whether the includes match, that is substructures and includes (named or unnamed) are treated the same. When importing data that was exported in a Release lower than 6.10, the includes are not checked.
    Addition 6
    ...IGNORING CONVERSION ERRORS
    Effect
    This addition has the effect that an error in the character set conversion does not cause a runtime error. The system uses "#" as a replacement character.
    Addition 7
    ... REPLACEMENT CHARACTER c
    Effect
    The system uses the specified replacement character if a character cannot be converted during a character set conversion. If this addition is not specified, the system uses "#" as a replacement character.
    This addition can only be used in conjunction with addition 6.
    Addition 8
    ... IN CHAR-TO-HEX MODE
    Effect
    No character-type fields are converted. For this, you must create a field or structure that is identical to the exported field or exported structure, except that all character-type fields must be replaced with hexadecimal fields.
    This addition, which is only allowed in programs with a set Unicode flag, allows you to import binary data disguised as single byte characters. This addition cannot be used in conjunction with additions 3, 4, 5, 6, and 7.
    Addition 9
    ... CODE PAGE INTO f1
    Effect
    The code page of the exported data is stored in the character-type field f1 (for example, to be able to analyze the data imported with the addition IN CHAR-TO-HEX MODE).
    Addition 10
    ... ENDIAN INTO f2
    Effect
    The byte order (LITTLE or BIG) of the exported data is stored in the field f2 (for example, to be able to analyze the data imported using the addition IN CHAR-TO-HEX MODE). The field f2 must be of type ABAP_ENDIAN, defined in type group ABAP. You must therefore include the type group ABAP in the ABAP program with a TYPE-POOLS statement.
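    Example (sketch)
    A minimal sketch of this variant. The buffer table reuses the INDX components CLUSTR (INT2) and CLUSTD (raw data) to satisfy the column-type requirement described above; the field F1 is made up.
    TYPES: BEGIN OF TY_BUFFER,
             CLUSTR TYPE INDX-CLUSTR,  " INT2 length column
             CLUSTD TYPE INDX-CLUSTD,  " raw data column (type X)
           END OF TY_BUFFER.
    DATA: LT_BUFFER   TYPE STANDARD TABLE OF TY_BUFFER,
          F1(10)      TYPE C VALUE 'HELLO',
          F1_BACK(10) TYPE C.
    * Store the field in the internal table buffer.
    EXPORT F1 = F1 TO INTERNAL TABLE LT_BUFFER.
    * Read it back; SY-SUBRC = 0 if the object was found.
    IMPORT F1 = F1_BACK FROM INTERNAL TABLE LT_BUFFER.
    IF SY-SUBRC = 0.
      WRITE: / F1_BACK.
    ENDIF.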
    Variant 3
    IMPORT obj1 ... objn FROM MEMORY.
    Extras:
    1. ... = f (for each object to be imported)
    2. ... TO f (for each object to be imported)
    3. ... ID key
    4. ... ACCEPTING PADDING
    5. ... ACCEPTING TRUNCATION
    6. ... IGNORING STRUCTURE BOUNDARIES
    The syntax check performed in an ABAP Objects context is stricter than in other ABAP areas. See You Must Enter Identification and Cannot Use Implicit Field Names in Clusters.
    Effect
    Imports data objects obj1 ... objn (fields, structures, complex structures or tables) from a data cluster in the ABAP memory (see EXPORT). Reads in all data without an ID that was exported to memory with "EXPORT ... TO MEMORY.". In contrast to the variant IMPORT FROM DATABASE, it does not check that the structure matches in EXPORT and IMPORT.
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The existing data objects in the data cluster specified were imported. The rest remain unchanged (in some circumstances, this may mean that no data objects were imported).
    SY-SUBRC = 4:
    The data objects could not be imported, probably because the ABAP memory was empty.
    The contents of all objects remain unchanged.
    Note
    You should always use addition 3 (... ID key) with this statement; the ID-less form exists only for compatibility with R/2, and otherwise the effect of the variant is not certain (EXPORT statements in different parts of a program overwrite each other in the ABAP memory).
    Additional methods for selecting and deleting data clusters in the ABAP memory are provided by the system class CL_ABAP_EXPIMP_MEM.
    Please consult Data Area and Modularization Unit Organization documentation as well.
    Addition 1
    ... = f (for each object to be imported)
    Addition 2
    ... TO f (for each object to be imported)
    Effect
    The object is placed in field f.
    Addition 3
    ... ID key
    Effect
    Imports only data stored in ABAP memory under the ID key.
    Notes
    The key, key, must be a character-type data object (but not a string).
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The existing data objects in the data cluster specified were imported. The rest remain unchanged (in some circumstances, this may mean that no data objects were imported).
    SY-SUBRC = 4:
    The data objects could not be imported, probably because an incorrect ID was used.
    The contents of all objects remain unchanged.
    Addition 4
    ... ACCEPTING PADDING
    Effect
    This addition allows you to append new fields to the end of structures, sub-structures, and internal tables. The IMPORT statement fills the additional fields with initial values; make existing fields (C, N, X, P, I1, and I2) longer; map character-type fields to STRING-type fields; or to map byte-type fields to XSTRING-type fields.
    Addition 5
    ... ACCEPTING TRUNCATION
    Effect
    This addition allows you to shorten the last CHAR field, or to omit the last component at the top level. (Until Release 4.6, you could do this without using an addition).
    Addition 6
    ... IGNORING STRUCTURE BOUNDARIES
    Effect
    This addition means that only the fragment sequence is relevant - that is, that any sub-structures match. If you use this addition, the system ignores any alignment changes necessitated by Unicode - such as inserting named includes.
    You cannot use this addition with either addition 4 (enlarge structure) or addition 5 (shorten structure), since it specifies that structure and include boundaries are to be ignored.
    From Release 6.10 onwards, the include information is stored in datasets, so that the system can also check that includes match - that is, that sub-structures and includes (named or unnamed) are treated equally. When data is imported in a Release prior to 6.10, includes are not checked.
    Related
    EXPORT TO MEMORY, DELETE FROM MEMORY, FREE MEMORY
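    Example (sketch)
    A minimal sketch of this variant, assuming a made-up field F1 and a freely chosen memory ID 'ZDEMO_MEM':
    DATA: F1(4)      TYPE C VALUE 'ABCD',
          F1_BACK(4) TYPE C.
    * Store the field in the ABAP memory under the chosen ID.
    EXPORT F1 = F1 TO MEMORY ID 'ZDEMO_MEM'.
    * Read it back within the same internal session; SY-SUBRC = 0 on success.
    IMPORT F1 = F1_BACK FROM MEMORY ID 'ZDEMO_MEM'.
    IF SY-SUBRC = 0.
      WRITE: / F1_BACK.
    ENDIF.
    * Remove the data cluster again (see Related above).
    DELETE FROM MEMORY ID 'ZDEMO_MEM'.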
    Variant 4
    IMPORT obj1 ... objn FROM SHARED MEMORY itab(ar) ID key.
    Extras:
    1. ... = f (for each object to be imported)
    2. ... TO f (for each object to be imported)
    3. ... CLIENT g (before ID key)
    4. ... TO wa (after itab(ar) or ID key )
    5. ... ACCEPTING PADDING
    6. ... ACCEPTING TRUNCATION
    7. ... IGNORING STRUCTURE BOUNDARIES
    The syntax check performed in an ABAP Objects context is stricter than in other ABAP areas.
    See You Cannot Use Implicit Field Names in Clusters and You Cannot Use Table Work Areas.
    Effect
    Imports the data objects obj1 ... objn (fields, structures, complex structures, or tables) from shared memory. The data objects are read using the ID key from the area ar in the table itab (cf. EXPORT TO SHARED MEMORY). You must use itab to specify a database table, although the system reads from a memory table with the appropriate structure.
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The existing data objects in the data cluster specified were imported. The rest remain unchanged. (In some circumstances, this may mean that no data objects were imported).
    SY-SUBRC = 4:
    The data objects could not be imported. You may have used the wrong ID. The contents of all the objects remain unchanged.
    Notes
    The table itab named after SHARED MEMORY must be declared using TABLES (except when addition 4, TO wa, is used).
    The structure of the fields, structures, and internal tables to be imported must match the structure of the objects exported in the dataset. The objects must be imported under the same names as those under which they were exported. Otherwise, they will not be imported.
    The key length consists of: the client (3 digits, but only if tab is client-specific); area (2 characters); ID; and line number (4 bytes). It must not exceed 64 bytes - that is, the ID must not be longer than 55 characters if the table is client-specific.
    The key, key, must be a character-type data object (but not a string).
    Additional methods for selecting and deleting data clusters in the shared memory are provided by the system class CL_ABAP_EXPIMP_SHMEM.
    Please consult Data Area and Modularization Unit Organization documentation as well.
    Addition 1
    ... = f (for each object to be imported)
    Addition 2
    ... TO f (for each object to be imported)
    Effect
    The object is stored in the field f.
    Addition 3
    ... CLIENT g (before ID key)
    Effect
    The data is imported from client g (provided the import/export table tab is client-specific). The client g must be a character-type data object (but not a string).
    Addition 4
    ... TO wa (after itab(ar) or ID key)
    Effect
    You need to use this addition if user data fields have been stored in the shared memory and are to be read from there. The work area wa is used instead of the table work area. The target area must correspond to the structure of the table tab specified.
    Addition 5
    ... ACCEPTING PADDING
    Effect
    This addition allows you to: append new fields to the end of structures, sub-structures, and internal tables (the IMPORT statement fills the additional fields with initial values); make existing fields (C, N, X, P, I1, and I2) longer; map character-type fields to STRING-type fields; or map byte-type fields to XSTRING-type fields.
    Addition 6
    ... ACCEPTING TRUNCATION
    Effect
    This addition allows you to shorten the last CHAR field, or to omit the last component at the top level. (Until Release 4.6, you could do this without using an addition.)
    Addition 7
    ... IGNORING STRUCTURE BOUNDARIES
    Effect
    This addition means that only the fragment sequence is relevant - that is, that any sub-structures match. If you use this addition, the system ignores any alignment changes necessitated by Unicode - such as inserting named includes.
    You cannot use this addition with either addition 5 (enlarge structure) or addition 6 (shorten structure), since it specifies that structure and include boundaries are to be ignored.
    From Release 6.10 onwards, the include information is stored in datasets, so that the system can also check that includes match - that is, that sub-structures and includes (named or unnamed) are treated equally. When data is imported in a Release prior to 6.10, includes are not checked.
    Related
    EXPORT TO SHARED MEMORY, DELETE FROM SHARED MEMORY
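    Example (sketch)
    A minimal sketch of this variant, assuming the import/export table INDX, an arbitrary two-character area 'ST' and an arbitrary ID 'ZDEMOKEY':
    TABLES INDX.   " table work area for the import/export table, as required by this variant
    DATA: F1(4)      TYPE C VALUE 'TEST',
          F1_BACK(4) TYPE C.
    * Put the field into the shared memory of the current application server.
    EXPORT F1 = F1 TO SHARED MEMORY INDX(ST) ID 'ZDEMOKEY'.
    * Read it back; SY-SUBRC = 0 if the object was found under this area and ID.
    IMPORT F1 = F1_BACK FROM SHARED MEMORY INDX(ST) ID 'ZDEMOKEY'.
    IF SY-SUBRC = 0.
      WRITE: / F1_BACK.
    ENDIF.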
    Variant 5
    IMPORT obj1 ... objn FROM SHARED BUFFER itab(ar) ID key.
    Extras:
    1. ... = f (for each object to be imported)
    2. ... TO f (for each object to be imported)
    3. ... CLIENT g (before ID key)
    4. ... TO wa (last addition or after itab(ar))
    The syntax check performed in an ABAP Objects context is stricter than in other ABAP areas.
    See Cannot Use Implicit Field Names in Clusters and Cannot Use Table Work Areas.
    Effect
    Imports data objects obj1 ... objn (fields or tables) from the cross-transaction application buffer. The data objects are read in the application buffer using the ID key of the area ar of the buffer area for the table itab (see EXPORT TO SHARED BUFFER). You must use itab to specify a database table, although the system reads from a memory table with an appropriate structure.
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The existing data objects in the data cluster specified were imported. The rest remain unchanged (in some circumstances, this means that no data objects were imported).
    SY-SUBRC = 4:
    The data objects could not be imported, probably because an incorrect ID was used.
    The contents of all objects remain unchanged.
    Example
    Import two fields and an internal table from the application buffer with the structure INDX:
    TYPES: BEGIN OF ITAB3_LINE,
             CONT(4),
           END OF ITAB3_LINE.
    DATA: INDXKEY LIKE INDX-SRTFD VALUE 'KEYVALUE',
          F1(4),
          F2(8) TYPE P DECIMALS 0,
          ITAB3 TYPE STANDARD TABLE OF ITAB3_LINE,
          INDX_WA TYPE INDX.
    Import data.
    IMPORT F1 = F1 F2 = F2 ITAB3 = ITAB3
           FROM SHARED BUFFER INDX(ST) ID INDXKEY TO INDX_WA.
    After the import, the data fields INDX-AEDAT and INDX-USERA in front of CLUSTR are filled with the values in the fields before the EXPORT statement.
    Notes
    You must declare the table dbtab, named after SHARED BUFFER, using a TABLES statement.
    The structure of the fields, structures, and internal tables to be imported must match the structure of the objects exported to the dataset. Moreover, the objects must be imported with the same name used to export them. Otherwise, the import is not performed.
    The maximum total key length is 64 bytes. It must include: a client if the table is client-specific (3 characters); an area (2 characters); identification; and line counter (4 bytes). This means that the number of characters available for the identification of a client-specific table is 55 characters.
    The key, key, must be a character-type data object (but not a string).
    Additional methods for selecting and deleting data clusters in the cross-transaction application buffer are provided by the system class CL_ABAP_EXPIMP_SHBUF.
    Please consult Data Area and Modularization Unit Organization documentation as well.
    Addition 1
    ... = f (for each object to be imported)
    Addition 2
    ... TO f (for each object to be imported)
    Effect
    The object is placed in the field f
    Addition 3
    ... CLIENT g (after dbtab(ar))
    Effect
    Takes the data from the client g (if the import/export table dbtab is client-specific). The client g must be a character-type data object (but not a string).
    Addition 4
    ... TO wa (as the last addition or after itab(ar))
    Effect
    You need to use this addition if you want to save user data fields in the application buffer and then read them from there later. The system uses a work area wa instead of a table work area. The target area must have the same structure as the table tab.
    Example
    DATA: INDX_WA TYPE INDX,
          F1.
    IMPORT F1 = F1 FROM SHARED BUFFER INDX(AR)
                   CLIENT '001' ID 'TEST'
                   TO INDX_WA.
    WRITE: / 'AEDAT:', INDX_WA-AEDAT,
           / 'USERA:', INDX_WA-USERA,
           / 'PGMID:', INDX_WA-PGMID.
    Variant 6
    IMPORT obj1 ... objn FROM DATABASE dbtab(ar) ID key.
    Extras:
    1. ... = f (for each object to be imported)
    2. ... TO f (for each object to be imported)
    3. ... CLIENT g (before ID key )
    4. ... USING form
    5. ... TO wa (last addition or after dbtab(ar))
    6. ... MAJOR-ID id1 (instead of ID key)
    7. ... MINOR-ID id2 (with MAJOR-ID id1 )
    8. ... ACCEPTING PADDING
    9. ... ACCEPTING TRUNCATION
    10. ... IGNORING STRUCTURE BOUNDARIES
    11. ... IGNORING CONVERSION ERRORS
    12. ... REPLACEMENT CHARACTER c
    13. ... IN CHAR-TO-HEX MODE
    14. ... CODE PAGE INTO f1
    15. ... ENDIAN INTO f2
    The syntax check performed in an ABAP Objects context is stricter than in other ABAP areas. See Cannot Use Implicit Fieldnames in Clusters and Cannot Use Table Work Areas.
    Effect
    Imports data objects obj1 ... objn (fields, structures, complex structures, or tables) from the data cluster with ID key in area ar of the database table dbtab (see EXPORT TO DATABASE).
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The existing data objects in the data cluster specified were imported. The rest remain unchanged (in some circumstances, this may mean that no data objects were imported).
    SY-SUBRC = 4:
    The data objects could not be imported, probably because an incorrect ID was used.
    The contents of all objects remain unchanged.
    Example
    Import two fields and an internal table:
    TYPES: BEGIN OF TAB3_TYPE,
              CONT(4),
           END OF TAB3_TYPE.
    DATA: INDXKEY LIKE INDX-SRTFD,
          F1(4), F2 TYPE P,
          TAB3 TYPE STANDARD TABLE OF TAB3_TYPE WITH
                    NON-UNIQUE DEFAULT KEY,
          WA_INDX TYPE INDX.
    INDXKEY = 'INDXKEY'.
    IMPORT F1   = F1
           F2   = F2
           TAB3 = TAB3 FROM DATABASE INDX(ST) ID INDXKEY
           TO WA_INDX.
    Notes
    You must declare the table dbtab, named after DATABASE, using the TABLES statement (except in addition 5).
    The structure of fields, field strings and internal tables to be imported must match the structure of the objects exported to the dataset. In addition, the objects must be imported under the same name used to export them. If this is not the case, either a runtime error occurs or no import takes place.
    Exception: You can lengthen or shorten the last field if it is of type CHAR, or add/omit CHAR fields at the end of the structure.
    The key, key, must be a character-type data object (but not a string).
    Additional methods for selecting and deleting data clusters in the database table specified are provided by the system class CL_ABAP_EXPIMP_DB.
    Addition 1
    ... = f (for each object to be imported)
    Addition 2
    ... TO f (for each object to be imported)
    Effect
    The object is placed in field f.
    Addition 3
    ... CLIENT g (before the ID key)
    Effect
    Data is taken from the client g (in client-specific import/export databases only). Client g must be a character-type data object (but not a string).
    Example
    DATA: F1,
          WA_INDX TYPE INDX.
    IMPORT F1 = F1 FROM DATABASE INDX(AR) CLIENT '002' ID 'TEST'
                   TO WA_INDX.
    Addition 4
    ... USING form
    Note
    This statement is for internal use only.
    Incompatible changes or further developments may occur at any time without warning or notice.
    Effect
    Does not read the data from the database. Instead, calls the FORM routine form for each record read from the database without this addition. This routine can take the data key of the data to be retrieved from the database table work area and write the retrieved data to this work area. The name of the routine has the format <name of database table>_<name of form>; it has one parameter which describes the operation (READ, UPDATE or INSERT). The routine must set the field SY-SUBRC in order to show whether the function was successfully performed.
    Addition 5
    ... TO wa (after key or after dbtab(ar))
    Effect
    You need to use this addition if you want to save user data fields in the cluster database and then read from there. The system uses the work area wa instead of a table work area. The target area entered must have the same structure as the table dbtab.
    Example
    DATA WA LIKE INDX.
    DATA F1.
    IMPORT F1 = F1 FROM DATABASE INDX(AR)
                   CLIENT '002' ID 'TEST'
                   TO WA.
    WRITE: / 'AEDAT:', WA-AEDAT,
           / 'USERA:', WA-USERA,
           / 'PGMID:', WA-PGMID.
    Addition 6
    ... MAJOR-ID id1 (instead of the ID key).
    Addition 7
    ... MINOR-ID id2 (with MAJOR-ID id1)
    This addition is not allowed in an ABAP Objects context. See Cannot Use Generic Identification.
    Effect
    Searches for a record the first part of whose ID (length of id1) matches id1 and whose second part - if MINOR-ID id2 is also declared - is greater than or equal to id2.
    Addition 8
    ... ACCEPTING PADDING
    Effect
    This addition allows you to: append new fields to the end of structures, sub-structures, and internal tables (the IMPORT statement fills the additional fields with initial values); make existing fields (C, N, X, P, I1, and I2) longer; map character-type fields to STRING-type fields; or map byte-type fields to XSTRING-type fields.
    Addition 9
    ... ACCEPTING TRUNCATION
    Effect
    This addition allows you to shorten the last CHAR field, or to omit the last component at the top level. (Until Release 4.6, you could do this without using an addition.)
    Addition 10
    ... IGNORING STRUCTURE BOUNDARIES
    Effect
    This addition means that only the fragment sequence is relevant - that is, that any sub-structures match. If you use this addition, the system ignores any alignment changes necessitated by Unicode - such as inserting named includes.
    You cannot use this addition with either addition 8 (enlarge structure) or addition 9 (shorten structure), since it specifies that structure and include boundaries are to be ignored.
    From Release 6.10 onwards, the include information is stored in datasets, so that the system can also check that includes match - that is, that sub-structures and includes (named or unnamed) are treated equally. When data is imported in a Release prior to 6.10, includes are not checked.
    Addition 11
    ...IGNORING CONVERSION ERRORS
    Effect
    This addition prevents the system from triggering a runtime error, if an error occurs when the character set is converted. '#' is used as a replacement character.
    Addition 12
    ... REPLACEMENT CHARACTER c
    Effect
    The replacement character is used if a particular character cannot be converted when the character set is converted. If you do not use this addition, '#' is used as a replacement character.
    This addition can only be used in conjunction with addition 11.
    Addition 13
    ... IN CHAR-TO-HEX MODE
    Effect
    Character-type fields are not converted. To convert a field, you must create a field (or structure) that is identical to the exported field or structure, except that all its character-type components must be replaced with hexadecimal fields.
    You can only use this addition in Unicode programs, to allow you to import camouflaged binary data as single-byte characters. Moreover, you cannot use this addition in conjunction with the additions 8, 9, 10, 11, and 12.
    Addition 14
    ... CODE PAGE INTO f1
    Effect
    The code page of the exported data is stored in the character-type field f1 - for example, to analyze data that has been imported with the IN CHAR-TO-HEX MODE addition.
    Addition 15
    ... ENDIAN INTO f2
    Effect
    The byte order (LITTLE or BIG) of the exported data is stored in the field f2 - for example, to analyze data that has been imported with the IN CHAR-TO-HEX MODE addition. The field f2 must have the type ABAP_ENDIAN, which is defined in the type group ABAP. For this reason, the type group ABAP must be included in the ABAP program using a TYPE-POOLS statement.
    Variant 7
    IMPORT obj1 ... objn FROM DATASET dsn(ar) ID key.
    This variant is not allowed in an ABAP Objects context. See Cannot Use Clusters in Files
    Note
    This variant is no longer supported and cannot be used.
    Variant 8
    IMPORT obj1 ... objn FROM LOGFILE ID key.
    Note
    This statement is for internal use only.
    Incompatible changes or further developments may occur at any time without warning or notice.
    Extras:
    1. ... = f (for each field f to be imported)
    2. ... TO f (for each field f to be imported)
    The syntax check performed in an ABAP Objects context is stricter than in other ABAP areas. See Cannot Use Implicit Field Names in Clusters
    Effect
    Imports data objects (fields, field strings or internal tables) from the update data. You must specify the update key assigned by the system (with current request number) as the key.
    The key, key, must be a character-type data object (but not a string).
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The existing data objects in the data cluster specified were imported. The rest remain unchanged (in some circumstances, this may mean that no data objects were imported).
    SY-SUBRC = 4:
    The data objects could not be imported. An incorrect ID may have been used.
    The contents of all objects remain unchanged.
    Addition 1
    ... = f (for each object to be imported)
    Addition 2
    ... TO f (for each object to be imported)
    Effect
    The object is placed in field f.
    Variant 9
    IMPORT DIRECTORY INTO itab FROM DATABASE dbtab(ar) ID key.
    Extras:
    1. ... CLIENT g (after dbtab(ar))
    2. ... TO wa (last addition or after dbtab(ar))
    The syntax check performed in an ABAP Objects context is stricter than in other ABAP areas. See Cannot Use Table Work Areas.
    Effect
    Imports an object directory stored under the specified ID with EXPORT TO DATABASE into the table itab. The internal table itab may not have the type HASHED TABLE or ANY TABLE.
    The key, key, must be a character-type data object (but not a string).
    The Return Code is set as follows:
    SY-SUBRC = 0:
    The directory was successfully imported.
    SY-SUBRC = 4:
    The directory could not be imported, probably because an incorrect ID was used.
    The internal table itab must have the same structure as the Dictionary structure CDIR (INCLUDE STRUCTURE).
    Addition 1
    ... CLIENT g (before ID key)
    Effect
    Takes data from the client g (only with client-specific import/export databases). Client g must be a character-type data object (but not a string).
    Addition 2
    ... TO wa (last addition or after dbtab(ar))
    Effect
    Uses the work area wa instead of the table work area. When you use this addition, you do not need to declare the table dbtab, named after DATABASE using a TABLES statement. The work area entered must have the same structure as the table dbtab.
    Example
    Directory of a cluster consisting of two fields and an internal table:
    TYPES: BEGIN OF TAB3_LINE,
             CONT(4),
           END OF TAB3_LINE,
           BEGIN OF DIRTAB_LINE.
             INCLUDE STRUCTURE CDIR.
    TYPES  END OF DIRTAB_LINE.
    DATA: INDXKEY LIKE INDX-SRTFD,
          F1(4),
          F2(8)   TYPE P decimals 0,
          TAB3    TYPE STANDARD TABLE OF TAB3_LINE,
          DIRTAB  TYPE STANDARD TABLE OF DIRTAB_LINE,
          INDX_WA TYPE INDX.
    INDXKEY = 'INDXKEY'.
    EXPORT F1 = F1
           F2 = F2
           TAB3 = TAB3
           TO DATABASE INDX(ST) ID INDXKEY " TAB3 has 17 entries
           FROM INDX_WA.
    IMPORT DIRECTORY INTO DIRTAB FROM DATABASE INDX(ST) ID INDXKEY
           TO INDX_WA.
    Then, the table DIRTAB contains the following:
    NAME     OTYPE  FTYPE  TFILL  FLENG
    F1         F      C      0      4
    F2         F      P      0      8
    TAB3       T      C      17     4
    The meaning of the individual fields is as follows:
    NAME:
    Name of stored object
    OTYPE:
    Object type (F: Field, R: Field string / Dictionary struc

  • LSMW me51n, how to create one PR document for all the records in the  file

    Hi all,
    I need to create an LSMW for t-code ME51N - Create Purchase Requisition. I'm using BAPI BUS2105, method CREATEFROMDATA, IDoc message type PREQCR, basic type PREQCR03. The problem is that the LSMW creates a different IDoc and a different PR document for every record in the source file. My requirement is to create one PR document per source file (every source file is a different purchase requisition). I'm trying to do this by writing some code (global functions) in the 'Mapping and conversion rules' events - BEGINOF_TRANSACTION_, ENDOF_TRANSACTION__... - but I'm not very sure what exactly I'm doing.
    Please help me resolve this problem; any help will be appreciated.
    Best regards, Emil Milchev.

    Thank you for you answer.
    But I have found a faster way of doing it - two source structures, one HEADER and one ITEM.
    HEADER: one empty text field and an identifier for it.
    ITEM: everything else.
    Then everything was just fine; I've mapped the different IDoc segments by the PREQ_ITEM fields (equal values in the source file: 10-10-10..., 20-20-20, ... etc.) and put all the required fields for my LSMW:
    SOURCE FIELDS:
    Z_ME51N_V2 - MASS_UPLOAD - CREATE create
    Source Fields
    UPFILE                    upload file
                IDENT                          C(010)    ident
                                               Identifing Field Content: header
                TEXT                           C(001)
                UPFILE2                   123
                    IDENT                          C(010)    ident
                                                   Identifing Field Content: item
                    BSART                          C(004)    Document type
                    BANFN                          C(010)    Purchase requisition number
                    BNFPO_FOR_MAP                  N(005)    Item number of purchase req. for MAPPING acc.
                    BNFPO                          N(005)    Item number of purchase requisition
                    KNTTP                          C(001)    Account assignment category
                    PSTYP                          C(001)    Item category in purchasing document
                    MATNR                          C(018)    Material Number
                    WERKS                          C(004)    Plant
                    LGORT                          C(004)    Storage Location
                    MENGE                          N(013)    Purchase requisition quantity
                    EKGRP                          C(003)    Purchasing group
                    KONNR                          C(010)    Number of principal purchase agreement
                    KTPNR                          N(005)    Item number of principal purchase agreement
                    LIFNR                          C(010)    Desired Vendor
                    FLIEF                          C(010)    Fixed Vendor
                    AFNAM                          C(012)    Name of requisitioner/requester
                    PREIS                          AMT4(011) Price in purchase requisition
                    ABLAD                          C(025)    Unloading Point
                    WEMPF                          C(012)    Goods Recipient
                    PS_POSID                       C(024)    Work Breakdown Structure Element (WBS Element)
                    KOSTL                          C(011)    COST_CTR in the BAPI?
                    NAME1                          C(040)    Name1 - Name of an address
                    NAME2                          C(040)    Name2 - Name of an address 2
                    STREET                         C(060)    Street
                    DELIVERY_DATE                  C(008)    Date on which the goods are to be delivered
                    TEXT                           C(132)    item text
    STRUCTURE RELATIONS :
    Structure Relations
    E1PREQCR              Header segment                                               <<<< UPFILE  upload file
               E1BPEBANC             Transfer Structure: Create Requisition Item                  <<<< UPFILE2 123
               E1BPEBKN              Transfer Structure: Create/Display Requisition Acct Assgt    <<<< UPFILE2 123
               E1BPEBANTX            BAPI Purchase Requisition: Item Text                         <<<< UPFILE2 123
               E1BPESUHC             Communication Structure: Limits                              <<<< UPFILE2 123
               E1BPESUCC             Communication Structure: Contract Limits                     <<<< UPFILE2 123
               E1BPESLLC             Communication Structure: Create Service Line                 <<<< UPFILE2 123
               E1BPESKLC             Create Comm. Structure: Acct Assgt Distr. for Service Line   <<<< UPFILE2 123
               E1BPESLLTX            BAPI Services Long Text                                      <<<< UPFILE  upload file
               E1BPMERQADDRDELIVERY  PO Item: Address Structure BAPIADDR1 for Inbound Delivery    <<<< UPFILE2 123
                   E1BPMERQADDRDELIVERY1 PO Item: Address Structure BAPIADDR1 for Inbound Delivery    <<<< UPFILE2 123
               E1BPPAREX             Ref. Structure for BAPI Parameter EXTENSIONIN/EXTENSIONOUT   <<<< UPFILE2 123
    MAINTAIN FIELD MAPPING AND... :
    the MAPPING between two IDOC`s segments:
    In first segment:
    E1BPEBANC                      Transfer Structure: Create Requisition Item
             Fields
                 PREQ_NO                      Purchase requisition number
                                     Source:  UPFILE2-BANFN (Purchase requisition number)
                                     Rule :   Transfer (MOVE)
                                     Code:    E1BPEBANC-PREQ_NO = UPFILE2-BANFN.
                 PREQ_ITEM                    Item number of purchase requisition
                                     Source:  UPFILE2-BNFPO (Item number of purchase requisition)
                                     Rule :   Transfer (MOVE)
                                     Code:    E1BPEBANC-PREQ_ITEM = UPFILE2-BNFPO.
    In second segment :
    E1BPEBKN                       Transfer Structure: Create/Display Requisition Acct Assgt
               Fields
                   PREQ_NO                      Purchase requisition number
                   PREQ_ITEM                    Item number of purchase requisition
                                       Source:  UPFILE2-BNFPO_FOR_MAP (Item number of purchase req. for MAPPING
                                       Rule :   Transfer (MOVE)
                                       Code:    E1BPEBKN-PREQ_ITEM = UPFILE2-BNFPO_FOR_MAP.
    After that everything was OK .

  • WLST offline writeDomain error

    Hi,
    I am using WLST Offline downloaded from the dev2dev site, which I believe is based on Jython. Anyway, I am trying to create a domain. All is well until I try to write the domain using writeDomain(). It fails with the following error:
    "java.io.IOException: Unable to resolve input source.
    Error: writeDomain() failed.
    Traceback (innermost last):
    File "wls.py", line 9, in ?
    File "initWls.py", line 70, in writeDomain
    com.bea.plateng.domain.script.jython.WLSTException: java.lang.NullPointerException
    at com.bea.plateng.domain.script.jython.CommandExceptionHandler.handleException(CommandExceptionHandler.java:33)
    at com.bea.plateng.domain.script.jython.WLScriptContext.handleException(WLScriptContext.java:890)
    at com.bea.plateng.domain.script.jython.WLScriptContext.writeDomain(WLScriptContext.java:459)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:324)
    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
    at org.python.core.PyMethod.__call__(PyMethod.java)
    at org.python.core.PyObject.__call__(PyObject.java)
    at org.python.core.PyInstance.invoke(PyInstance.java)
    I have a valid directory and can write into it without problems. Any ideas why this is failing to write the domain?
    Thx,
    Ravi

    Hello,
    I believe you did not set up your classpath according to the doc.
    Specifically, I think you did not put <WL_HOME>/server/lib in the classpath.
    The script runs fine on my machine. Please follow the readme.txt,
    especially step 3 and step 4:
    3. Extract the following files from the WLST offline configuration kit:
    NOTE: <WL_HOME> refers to the root directory of your WebLogic
    Platform installation.
    By default, the pathname for this directory is c:\bea\weblogic81.
    o WLST JAR files, including config.jar, comdev.jar, and 3rdparty.jar, to
    <WL_HOME>\common\lib.
    NOTE: It is recommended that you back up the existing JAR files.
    For version compatibility,
    they may have to be used when you use non-WLST related Weblogic
    features.
    o runWLSTOffline.cmd and runWLSTOffline.sh script files to
    <WL_HOME>\common\bin.
    o (Optional) Sample script files to the desired location.
    4. Update the CLASSPATH environment variable to include the following
    WebLogic Server,
    Jython, and WLST files and directories:
    NOTE: <JYTHON_HOME> refers to the root directory of your Jython
    installation.
    <WL_HOME>\server\lib
    <WL_HOME>\server\lib\weblogic.jar
    <JYTHON_HOME>\jython.jar
    <WL_HOME>\common\lib\config.jar
    <WL_HOME>\common\lib\comdev.jar
    <WL_HOME>\common\lib\3rdparty.jar
    Thanks,
    -satya
    Web Team wrote:
    Hi, the log is as follows:
    "========================================================
    << read template from "/opt/was/bea/weblogic81/common/templates/domains/wls.jar"
    succeed: read template from "/opt/was/bea/weblogic81/common/templates/domains/wls.jar"<< find Server "myserver" as obj0
    succeed: find Server "myserver" as obj0<< set obj0 attribute ListenAddress to ""
    succeed: set obj0 attribute ListenAddress to ""<< set obj0 attribute ListenPort to "7001"
    succeed: set obj0 attribute ListenPort to "7001"<< find User "weblogic" as obj1
    succeed: find User "weblogic" as obj1<< set obj1 attribute Password to "********"
    succeed: set obj1 attribute Password to "********"<< set config option OverwriteDomain to "true"
    succeed: set config option OverwriteDomain to "true"<< write Domain to "/opt/was/ravi/user_projects/mydomain"
    esourceBundleManager - Retrieved (Everyone of all groups.) under key (SecurityDesc.group.everyone) from namespace <config>.
    2004-09-10 08:52:13,397 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (built-in anonymous role) under key (SecurityDesc.role.anonymous) from namespace <config>.
    2004-09-10 08:52:13,649 INFO [main] com.bea.plateng.domain.TemplateBuilder - _apps_ not found in the template jar. Assuming old template structure.
    2004-09-10 08:52:14,647 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Target) under key (target) from namespace <config>.
    2004-09-10 08:52:14,664 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Server) under key (Server) from namespace <config>.
    2004-09-10 08:52:14,779 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Application) under key (application) from namespace <config>.
    2004-09-10 08:52:14,863 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Target) under key (target) from namespace <config>.
    2004-09-10 08:52:14,867 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Server) under key (Server) from namespace <config>.
    2004-09-10 08:52:14,875 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Service) under key (service) from namespace <config>.
    2004-09-10 08:52:14,879 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Migratable RMI Service) under key (migratableRMIService) from namespace <config>.
    2004-09-10 08:52:14,883 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Shutdown Class) under key (shutdownClass) from namespace <config>.
    2004-09-10 08:52:14,887 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Startup Class) under key (startupClass) from namespace <config>.
    2004-09-10 08:52:14,891 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (File T3) under key (fileT3) from namespace <config>.
    2004-09-10 08:52:14,896 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Messaging Bridge) under key (messagingBridge) from namespace <config>.
    2004-09-10 08:52:14,910 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Jolt Connection Pool) under key (joltConnectionPool) from namespace <config>.
    2004-09-10 08:52:14,916 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (WLEC Connection Pool) under key (wlecConnectionPool) from namespace <config>.
    2004-09-10 08:52:14,932 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (WTC Server) under key (wtcServer) from namespace <config>.
    2004-09-10 08:52:15,165 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Attempting to replace component: FileStore of type: com.bea.plateng.domain.xml.config.JMSFileStoreType in
    2004-09-10 08:52:15,301 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Attempting to remove component: FileStore of type: com.bea.plateng.domain.xml.config.JMSFileStoreType from
    2004-09-10 08:52:15,323 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Attempting to find component: FileStore of type: com.bea.plateng.domain.xml.config.JMSFileStoreType in
    2004-09-10 08:52:15,356 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Attempting to add component: FileStore of type: com.bea.plateng.domain.xml.config.JMSFileStoreType to
    2004-09-10 08:52:15,376 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Added: FileStore to
    2004-09-10 08:52:15,379 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Component: FileStore of type: com.bea.plateng.domain.xml.config.JMSFileStoreType was replaced in
    2004-09-10 08:52:15,403 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Attempting to replace component: RMDefaultPolicy of type: com.bea.plateng.domain.xml.config.WSReliableDeliveryPolicyType in
    2004-09-10 08:52:15,407 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Attempting to remove component: RMDefaultPolicy of type: com.bea.plateng.domain.xml.config.WSReliableDeliveryPolicyType from
    2004-09-10 08:52:15,409 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Attempting to find component: RMDefaultPolicy of type: com.bea.plateng.domain.xml.config.WSReliableDeliveryPolicyType in
    2004-09-10 08:52:15,425 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Attempting to add component: RMDefaultPolicy of type: com.bea.plateng.domain.xml.config.WSReliableDeliveryPolicyType to
    2004-09-10 08:52:15,432 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Added: RMDefaultPolicy to
    2004-09-10 08:52:15,435 DEBUG [main] com.bea.plateng.domain.ApplicationTemplate - Component: RMDefaultPolicy of type: com.bea.plateng.domain.xml.config.WSReliableDeliveryPolicyType was replaced in
    2004-09-10 08:52:15,545 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (WS Reliable Delivery Policy) under key (WSReliableDeliveryPolicy) from namespace <config>.
    2004-09-10 08:52:15,548 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (JMS JDBC Store) under key (jms.jdbcStore) from namespace <config>.
    2004-09-10 08:52:15,551 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (JMS File Store) under key (jms.fileStore) from namespace <config>.
    2004-09-10 08:52:15,661 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,669 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,676 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,678 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,680 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,750 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (group) from namespace <config>.
    2004-09-10 08:52:15,757 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,760 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,762 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,765 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,796 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Role) under key (Role) from namespace <config>.
    2004-09-10 08:52:15,801 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,807 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,813 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,835 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,837 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,839 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,842 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,846 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,848 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (Group) under key (Group) from namespace <config>.
    2004-09-10 08:52:15,851 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User) under key (User) from namespace <config>.
    2004-09-10 08:52:15,967 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - succeed: read template from "/opt/was/bea/weblogic81/common/templates/domains/wls.jar"
    2004-09-10 08:52:16,399 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - find Server "myserver" as obj0
    2004-09-10 08:52:16,636 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Server.Name) from namespace <config>.
    2004-09-10 08:52:16,642 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Name) from namespace <config>.
    2004-09-10 08:52:16,648 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Server.ListenAddress) from namespace <config>.
    2004-09-10 08:52:16,659 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenAddress) from namespace <config>.
    2004-09-10 08:52:16,666 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Server.ListenPort) from namespace <config>.
    2004-09-10 08:52:16,670 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenPort) from namespace <config>.
    2004-09-10 08:52:16,674 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (SSL.ListenPort) from namespace <config>.
    2004-09-10 08:52:16,676 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenPort) from namespace <config>.
    2004-09-10 08:52:16,679 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (SSL.Enabled) from namespace <config>.
    2004-09-10 08:52:16,681 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Enabled) from namespace <config>.
    2004-09-10 08:52:16,686 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - succeed: find Server "myserver" as obj0
    2004-09-10 08:52:17,406 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - set obj0 attribute ListenAddress to ""
    2004-09-10 08:52:17,441 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - succeed: set obj0 attribute ListenAddress to ""
    2004-09-10 08:52:17,455 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - set obj0 attribute ListenPort to "7001"
    2004-09-10 08:52:17,463 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - succeed: set obj0 attribute ListenPort to "7001"
    2004-09-10 08:52:17,467 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - find User "weblogic" as obj1
    2004-09-10 08:52:17,512 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (User name) under key (User.Name) from namespace <config>.
    2004-09-10 08:52:17,516 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (User.UserPassword) from namespace <config>.
    2004-09-10 08:52:17,517 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (UserPassword) from namespace <config>.
    2004-09-10 08:52:17,522 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (User.ConfirmUserPassword) from namespace <config>.
    2004-09-10 08:52:17,526 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ConfirmUserPassword) from namespace <config>.
    2004-09-10 08:52:17,528 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (User.Description) from namespace <config>.
    2004-09-10 08:52:17,530 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Description) from namespace <config>.
    2004-09-10 08:52:17,534 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - succeed: find User "weblogic" as obj1
    2004-09-10 08:52:17,575 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - set obj1 attribute Password to "********"
    2004-09-10 08:52:17,579 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - succeed: set obj1 attribute Password to "********"
    2004-09-10 08:52:17,582 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - set config option OverwriteDomain to "true"
    2004-09-10 08:52:17,584 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - succeed: set config option OverwriteDomain to "true"
    2004-09-10 08:52:17,598 INFO [main] com.bea.plateng.domain.script.ScriptExecutor - write Domain to "/opt/was/ravi/user_projects/mydomain"
    2004-09-10 08:52:17,822 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Server.Name) from namespace <config>.
    2004-09-10 08:52:17,826 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Name) from namespace <config>.
    2004-09-10 08:52:17,828 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Server.ListenAddress) from namespace <config>.
    2004-09-10 08:52:17,832 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenAddress) from namespace <config>.
    2004-09-10 08:52:17,835 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Server.ListenPort) from namespace <config>.
    2004-09-10 08:52:17,837 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenPort) from namespace <config>.
    2004-09-10 08:52:17,840 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (SSL.ListenPort) from namespace <config>.
    2004-09-10 08:52:17,845 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenPort) from namespace <config>.
    2004-09-10 08:52:17,847 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (SSL.Enabled) from namespace <config>.
    2004-09-10 08:52:17,848 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Enabled) from namespace <config>.
    2004-09-10 08:52:17,947 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Cluster.Name) from namespace <config>.
    2004-09-10 08:52:17,952 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Name) from namespace <config>.
    2004-09-10 08:52:17,957 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Cluster.MulticastAddress) from namespace <config>.
    2004-09-10 08:52:17,960 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (MulticastAddress) from namespace <config>.
    2004-09-10 08:52:17,963 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Cluster.MulticastPort) from namespace <config>.
    2004-09-10 08:52:17,966 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (MulticastPort) from namespace <config>.
    2004-09-10 08:52:17,968 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Cluster.ClusterAddress) from namespace <config>.
    2004-09-10 08:52:17,970 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ClusterAddress) from namespace <config>.
    2004-09-10 08:52:18,086 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Machine.Name) from namespace <config>.
    2004-09-10 08:52:18,089 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Name) from namespace <config>.
    2004-09-10 08:52:18,091 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (NodeManager.ListenAddress) from namespace <config>.
    2004-09-10 08:52:18,095 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenAddress) from namespace <config>.
    2004-09-10 08:52:18,096 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (NodeManager.ListenPort) from namespace <config>.
    2004-09-10 08:52:18,098 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenPort) from namespace <config>.
    2004-09-10 08:52:18,142 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (UnixMachine.Name) from namespace <config>.
    2004-09-10 08:52:18,146 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (Name) from namespace <config>.
    2004-09-10 08:52:18,148 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (UnixMachine.PostBindGIDEnabled) from namespace <config>.
    2004-09-10 08:52:18,150 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (PostBindGIDEnabled) from namespace <config>.
    2004-09-10 08:52:18,156 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (UnixMachine.PostBindGID) from namespace <config>.
    2004-09-10 08:52:18,158 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (PostBindGID) from namespace <config>.
    2004-09-10 08:52:18,160 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (UnixMachine.PostBindUIDEnabled) from namespace <config>.
    2004-09-10 08:52:18,162 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (PostBindUIDEnabled) from namespace <config>.
    2004-09-10 08:52:18,165 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (UnixMachine.PostBindUID) from namespace <config>.
    2004-09-10 08:52:18,166 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (PostBindUID) from namespace <config>.
    2004-09-10 08:52:18,168 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (NodeManager.ListenAddress) from namespace <config>.
    2004-09-10 08:52:18,171 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenAddress) from namespace <config>.
    2004-09-10 08:52:18,174 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (NodeManager.ListenPort) from namespace <config>.
    2004-09-10 08:52:18,177 DEBUG [main] com.bea.plateng.common.util.ResourceBundleManager - Retrieved (null) under key (ListenPort) from namespace <config>.
    2004-09-10 08:52:18,730 DEBUG [main] com.bea.plateng.domain.jdbc.JDBCHelper - jdbcdrivers.xml not found in classpath, trying "../../server/lib/jdbcdrivers.xml"
    2004-09-10 08:52:18,737 DEBUG [main] com.bea.plateng.domain.jdbc.JDBCHelper - jdbcdrivers.xml not found at "/opt/server/lib/jdbcdrivers.xml", giving up...
    2004-09-10 08:52:18,917 ERROR [main] com.bea.plateng.domain.jdbc.JDBCHelper - weblogic.xml.stream.XMLStreamException: Unable to instantiate the stream, the error was: Unable to resolve input source.
    weblogic.xml.stream.XMLStreamException: Unable to instantiate the stream, the error was: Unable to resolve input source.
         at weblogic.xml.babel.stream.XMLInputStreamBase.open(XMLInputStreamBase.java:91)
         at weblogic.xml.babel.stream.XMLInputStreamBase.open(XMLInputStreamBase.java:49)
         at weblogic.xml.babel.stream.XMLInputStreamFactoryImpl.newInputStream(XMLInputStreamFactoryImpl.java:67)
         at weblogic.xml.babel.stream.XMLInputStreamFactoryImpl.newInputStream(XMLInputStreamFactoryImpl.java:49)
         at weblogic.xml.babel.stream.XMLInputStreamFactoryImpl.newInputStream(XMLInputStreamFactoryImpl.java:79)
         at weblogic.jdbc.utils.JDBCConnectionMetaDataParser.loadSchema(JDBCConnectionMetaDataParser.java:216)
         at weblogic.jdbc.utils.JDBCConnectionMetaDataParser.<init>(JDBCConnectionMetaDataParser.java:127)
         at com.bea.plateng.domain.jdbc.JDBCHelper.getDriverInfoFactory(JDBCHelper.java:393)
         at com.bea.plateng.domain.jdbc.JDBCAspectHelper.initDriverMap(JDBCAspectHelper.java:613)
         at com.bea.plateng.domain.jdbc.JDBCAspectHelper.getJDBCDriverClassTable(JDBCAspectHelper.java:602)
         at com.bea.plateng.domain.jdbc.JDBCAspectHelper.getGenericJDBCDriverInfo(JDBCAspectHelper.java:731)
         at com.bea.plateng.domain.jdbc.JDBCAspectHelper.getGenericJDBCDriverInfo(JDBCAspectHelper.java:211)
         at com.bea.plateng.domain.aspect.JDBCConnectionPoolDriverNameConfigAspect.decompose(JDBCConnectionPoolDriverNameConfigAspect.java:54)
         at com.bea.plateng.domain.aspect.ConfigAspectImpl.setDelegate(ConfigAspectImpl.java:493)
         at com.bea.plateng.domain.aspect.ConfigAspectBuilder.createJDBCConnectionPoolSimpleAspect(ConfigAspectBuilder.java:367)
         at com.bea.plateng.domain.operation.config.ConfigJDBCConnectionPool.createNewSimpleConfigAspects(ConfigJDBCConnectionPool.java:121)
         at com.bea.plateng.domain.operation.HTableEditOperation.createSimpleTableModel(HTableEditOperation.java:647)
         at com.bea.plateng.domain.operation.HTableEditOperation.getSimpleTableModel(HTableEditOperation.java:299)
         at com.bea.plateng.domain.operation.HTableEditOperation.initSimpleTableModel(HTableEditOperation.java:531)
         at com.bea.plateng.domain.DomainChecker.isOperationValid(DomainChecker.java:590)
         at com.bea.plateng.domain.DomainChecker.getInvalidSection(DomainChecker.java:155)
         at com.bea.plateng.domain.GeneratorHelper.validateDomainCreation(GeneratorHelper.java:82)
         at com.bea.plateng.domain.script.ScriptExecutor.writeDomain(ScriptExecutor.java:516)
         at com.bea.plateng.domain.script.jython.WLScriptContext.writeDomain(WLScriptContext.java:453)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx0.writeDomain$14(initWls.py:70)
         at org.python.pycode._pyx0.call_function(initWls.py)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyFunction.__call__(PyFunction.java)
         at org.python.pycode._pyx1.f$0(wls.py:9)
         at org.python.pycode._pyx1.call_function(wls.py)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.__builtin__.execfile_flags(__builtin__.java)
         at org.python.util.PythonInterpreter.execfile(PythonInterpreter.java)
         at com.bea.plateng.domain.script.jython.WLST_offline.main(WLST_offline.java:50)
    =========================================================="
    The script is below:
    "++++++++++++++++++++++++++++++++++++++++++++++++++++
    readTemplate('/opt/was/bea/weblogic81/common/templates/domains/wls.jar')
    cd('Server/myserver')
    set('ListenAddress','')
    set('ListenPort',7001)
    cd('/Security/mydomain')
    cd('User/weblogic')
    cmo.setPassword('weblogic')
    setOption('OverwriteDomain','true')
    writeDomain('/opt/was/ravi/user_projects/mydomain')
    dumpStack()
    dumpVariables()
    closeTemplate()
    exit()
    ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
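    For what it is worth, the script above runs straight through even when writeDomain() fails, so dumpStack() and dumpVariables() end up reporting on an already-broken session. A minimal variation (a sketch only, reusing the same WLST offline commands and paths that already appear in the script) guards the write step so the diagnostics run only on failure:
    readTemplate('/opt/was/bea/weblogic81/common/templates/domains/wls.jar')
    cd('Server/myserver')
    set('ListenAddress','')
    set('ListenPort',7001)
    cd('/Security/mydomain')
    cd('User/weblogic')
    cmo.setPassword('weblogic')
    setOption('OverwriteDomain','true')
    try:
        writeDomain('/opt/was/ravi/user_projects/mydomain')
    except:
        # dump diagnostics only when the domain write fails
        dumpStack()
        dumpVariables()
    closeTemplate()
    exit()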

  • A mapped elements relationship with other elements cannot be preserved

    Hi,
    I am trying to create an XML file from an XSD using Excel macros and data from an Excel worksheet. The mapping works fine. However, when I try to save the mapped items into an XML file using the SaveAsXML function, Excel gives the following reason why the XML is not exportable:
    A mapped elements relationship with other elements cannot be preserved
    Could someone please help in resolving this issue?
    Thanks

    Preamble: I am not a specialist, quite the contrary. I discovered XML a few days ago while trying to submit returns based on two quite different compulsory XSD schemas established by local authorities. So this note has no other pretension than to try to help others starting from the same level.
    The plus: I discovered I could do it (with limitations) from Excel 2010 by attaching the provided XSD and mapping the relevant fields to Excel data cells. This is reasonably well described on MS sites and elsewhere, although the MS tutorials do not focus on this particular objective, which matters because authorities increasingly expect returns in XML.
    The minus: I stumbled across some frustrating issues (in fact pulling my hair out...) that could be better highlighted in the MS documentation for newbies like me, because at that stage of knowledge, and on your own, everything seems insurmountable. When Excel gives the error messages, there is little indication of why or where the error occurs. My two common errors that were difficult to debug:
    1. Denormalized data error
    2. Relationship cannot be preserved error
    What I was doing wrong, first for one XSD and then, by accident, even for the second:
    1. I was mistakenly mapping an element that occurs at most once in the XSD schema to an element in an "Excel Table" (the "new" table formatting available in Excel 2010). I solved this by de-mapping the culprit element, converting the "Excel Table" to ordinary ranges using the ribbon button provided for this, and re-mapping. From then on, (nearly) every time I mapped a new element or attribute to a cell in the worksheet, I used the "Verify Map for Export" button, which gave me a debugging message early enough, rather than mapping the whole data set, finding out too late that it was wrong, and not knowing exactly where it had started to go wrong. Also, for the repeating component, I set up the Excel table for the whole recursive element manually beforehand, because I found that when using the dragging process the results could be random (some attributes ending up in the table, others in other tables, resulting in type 2 relationship errors).
    2. The relationship error is a more nagging issue, as it seems to relate to a structural limitation of the Excel 2010 export mechanism. Flattening a database with several depth levels (a list of lists) is not trivial, so Microsoft stipulates, somewhat buried in the notes when it should be in bold at the top, that Excel (which versions? 2013 as well?) "does not support recursive structures that are more than one level deep". Happily, while the XML schemas I had to follow provided for several levels of depth (3 in my case), I needed only one level for both XML files. So I copied the "too complex" XSD schema and edited it manually to remove 2 levels. I remapped and tested; the error messages were gone, and I exported. Then I simply had to add the previously deleted opening and closing tags back manually using a free editor (Vim, with thanks to its author).
    But unless my understanding is wrong, which is quite possible, readers should understand that this documented limitation can become a structural stumbling block if they absolutely need recursion more than one level deep (a list of lists). So if it can be done in Access from a database (I still need to find out how, and forum guidance is more than welcome), that will be my future avenue to explore.
    To finish, I validated the exported XML against the compulsory XSD schema using the free online tool provided by CoreFiling, to make sure no basic formatting mistakes had been made before submitting the returns for online validation by the authorities' application. I have MS Visual Studio Express installed, which may help as well, but I don't know how to use it for this.
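    If you have Python available, you can also validate locally before submitting, instead of (or in addition to) an online tool. The snippet below is a minimal sketch using the third-party lxml library; the file names schema.xsd and export.xml are placeholders for your own files, not names from this thread.
    # Validate an exported XML file against an XSD locally (requires: pip install lxml).
    from lxml import etree

    schema = etree.XMLSchema(etree.parse("schema.xsd"))   # the compulsory schema
    doc = etree.parse("export.xml")                       # the file exported from Excel

    if schema.validate(doc):
        print("export.xml is valid against schema.xsd")
    else:
        # error_log lists every violation with its line number
        for error in schema.error_log:
            print(f"line {error.line}: {error.message}")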
    NB: I also found it easier to copy all the XSD elements and attributes in bulk (by dragging the top of the XSD tree) from the schema box on the right to a worksheet. While the result would not be usable "as is" for mapping (it would most probably give errors), it at least provides a good initial template with the right headings for all the elements and attributes, making it easier to restructure (elements occurring once vs. elements that are part of a recursive table) and then map properly and reasonably fast. Before doing this, I was copying the XML headings manually from the Word documentation provided by the authorities, which worked but was more cumbersome and error-prone.
    Conclusion
    I had never read about XML before (I had just heard about it often). It took me around 6-7 hours to learn some critical basics about it and about these specific XSD schemas. With a few trials and errors, I could use Excel 2010 to produce the XML files required for the two schemas I needed. The great advantage compared to submitting data online to the authorities is that the process is easy to repeat quietly at work (or at home) and lets you check the XML data files calmly and thoroughly before submitting them. If wrong submissions have been made, it also makes automated corrections much easier. It is within reach for users with reasonable but not outstanding Excel/XML knowledge, but it requires some initial time investment.
    Hope this saves time for others.
    acontrario
    Brussels

  • Is it possible to create a 'General' plugin with C++ SDK?

    The SDK mentions that we can create 'Automation', 'Export', etc. plugins with C++. Do these types have their own limitations? For example, can an 'Export' plugin not access the whole API exposed to an 'Automation' plugin? Or is the only difference that they appear under different sections of the menus?
    What I need is to build a general-purpose plugin that might have to use some export functionality along with selection and 3D features.
    Thanks!

    The short answer is yes: in addition to their location under different sections of the Photoshop menus, each plugin type has its own internal limitations, sequence of invocation, etc. The closest you can get to a "general purpose" plugin is an automation plugin, as it can be triggered by various events and can also invoke other plugins. A common approach when developing a plugin with mixed functionality is to have a main "dispatcher" automation plugin that either invokes other plugins as helpers (e.g. a hidden filter to process pixel data) or is invoked by other plugins, either on events or via an exported suite. You may want to browse the Automation examples in the SDK samplecode folder.
    I would also suggest looking into the "Plugin modules" section of the SDK documentation - it should have the information you need in terms of structure, limitations, etc. The FAQ section is also worth reading.
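    For illustration only, here is a minimal, language-agnostic sketch of that dispatcher idea written in Python (it is not Photoshop SDK code; the event names and helper functions are invented for the example). The point is simply that one central piece registers helpers and routes events to them, which is roughly the role the automation plugin plays in the SDK samples.
    # Generic "dispatcher" sketch - illustrative only, not the Photoshop C++ SDK API.
    # Helper names and event strings are made up for the example.
    from typing import Callable, Dict, List

    class Dispatcher:
        def __init__(self) -> None:
            self._handlers: Dict[str, List[Callable[[dict], None]]] = {}

        def register(self, event: str, handler: Callable[[dict], None]) -> None:
            # A helper (e.g. a hidden filter) registers for the events it cares about.
            self._handlers.setdefault(event, []).append(handler)

        def dispatch(self, event: str, payload: dict) -> None:
            # The main plugin is invoked on an event and forwards it to its helpers.
            for handler in self._handlers.get(event, []):
                handler(payload)

    def export_helper(payload: dict) -> None:
        print("exporting selection:", payload)

    def filter_helper(payload: dict) -> None:
        print("processing pixel data:", payload)

    dispatcher = Dispatcher()
    dispatcher.register("document-saved", export_helper)
    dispatcher.register("selection-changed", filter_helper)
    dispatcher.dispatch("selection-changed", {"bounds": (0, 0, 64, 64)})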

  • Strategy for kerning groups

    Hello,
    Is there any strategy available to avoid MakeOTF being unable to access some kern pairs?
    makeotflib [WARNING] <FONT_X> Start of new pair positioning subtable; some pairs may never be accessed: [t] [r r.alt]
    If I write the (subtable;) command I can "fix" this message, but the generated code is invalid.
    Another option is to split every kerning group into single kern pairs, but this is an enormous task and a new source of failures.
    Andreas

    This message shows up when MakeOTF is forced to split a lookup table into two subtables in order to separate conflicting glyph class definitions.
    You may remember that in class kerning, all the left-side glyph classes that are applied in a single lookup table must be mutually exclusive; that is, a glyph may not belong to more than one left-side class in a single table. The same holds for right-side classes. When you include the same glyph in different classes on the same side in your list of kern class pairs, MakeOTF can solve the structural limitation only by breaking the lookup table in two, so that the conflicting classes are used in different subtables. However, this solves only the structural problem - you still have a functional problem. In your example, the first kern pair that uses the glyph 't' on the left side will mask any subsequent kern pair that includes the glyph 't' on the same side: all right-side glyphs NOT paired with the glyph 't' in the first subtable will be assigned a kern value of zero, and applications will never proceed to see that there are additional kern pairs with 't' on the left side. This is because, for any lookup table, all the glyphs not included in any class on the left or right side are put in an invisible class and assigned a value of zero.
    What this message should really say is: "You made an error in building either your left-side or right-side classes for use in the current lookup table - please fix the class definitions so that they are mutually exclusive."
    One way to picture all this is to imagine the class pairs as a spreadsheet of all possible class pairs. Each left-side class is the title of a row, and each right-side class is the title of a column. A glyph can appear in only one row title, and in only one column title. All glyphs not named in any row title get put together in a special row, and all glyphs not named in any column title get put together in a special column. When you specify the value of a class pair, you are specifying the value in one cell of the spreadsheet; all cells for which no values are specified default to 0. When a program looks for a kern value between 't' and something else, it looks through the list of left-side class definitions to find the first occurrence of 't'. By definition, the spreadsheet row for 't' defines the kern pair value of 't' with all other glyphs, so the program does not look further. As a result, any class kern pairs with 't' on the left side in subsequent subtables will never be seen.
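    As a quick sanity check before running MakeOTF, you can test that mutual-exclusivity rule yourself. Below is a small sketch (not part of the AFDKO tools; the class names and glyph lists are invented for the example) that flags any glyph appearing in more than one left-side class; the same check applies to right-side classes.
    # Sketch: detect glyphs that appear in more than one left-side kern class.
    # Class names and contents are made-up examples, not taken from a real font.
    left_classes = {
        "@LEFT_T": ["t"],
        "@LEFT_T_ALT": ["t", "tcaron"],   # 't' appears again -> forces a subtable break
        "@LEFT_R": ["r", "r.alt"],
    }

    seen = {}
    for class_name, glyphs in left_classes.items():
        for glyph in glyphs:
            if glyph in seen:
                print(f"'{glyph}' is in both {seen[glyph]} and {class_name}: "
                      "classes on the same side must be mutually exclusive")
            else:
                seen[glyph] = class_name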
    Using the 'subtable' keyword fixes only the structural problem - MakeOTF can then build the feature as specified - but it is still broken from the point of view of intended function.
    I am curious about the secondary problem you mention. When you use the 'subtable' keyword, in what way is the generated code invalid?

  • Maximum number of open files..

    I'm looking for some help...probably a consultant to give us a call.
    I need to know the following:
    For Solaris 2.6 and 7: the default and maximum settings for the number of open files per process.
    The procedure to change the default setting to the maximum.
    The amount of RAM required to handle the maximum setting.
    The risks inherent in setting this parameter to the maximum.
    Any info on test environments where the maximum setting has been used (e.g. database TPC benchmarks, etc.).
    Feel free to call 408.861.1103 - happy to pay for the advice.

    Hi!
    The maximum number of file descriptors per process is set by two parameters:
    rlim_fd_cur (soft limit, defaults to 64)
    rlim_fd_max (hard limit, defaults to 1024)
    Processes may raise their soft limit up to the hard limit using setrlimit(2).
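    For example, a process can raise its own soft limit up to the hard limit at run time. Here is a minimal sketch using Python's resource module, which wraps the same getrlimit(2)/setrlimit(2) calls; the equivalent can be done from C with the system calls directly.
    # Raise this process's soft file-descriptor limit up to the hard limit (sketch).
    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print(f"before: soft={soft} hard={hard}")

    # The soft limit may be raised up to, but never beyond, the hard limit.
    resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print(f"after:  soft={soft} hard={hard}")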
    Setting rlim_fd_cur high is not a problem, as the file descriptors are allocated in chunks of 24 as required rather than all in one go. They don't actually require that much memory either.
    As administrator you may set the limits by adding an entry to /etc/system, eg:
    set rlim_fd_max=600
    and rebooting.
    Note, however, that on 32-bit Solaris the significant limitation is that the stdio library FILE structure limits your process to 256 fds. This is increased to 65536 for 64-bit programs on Solaris 7.
    select(3c) can use up to 65536 fds (#define FD_SETSIZE 65536 in your code for 32-bit Solaris 7).
    Hope that helps.
    Ralph
    SUN DTS

  • Installing Office for Mac 2008

    I have the opportunity through work to purchase a full download of the above for a very cheap price. I currently have Office for Mac 2004 installed.
    Questions :
    1. Should I uninstall my current version or install over the top?
    2. Does 2008 run natively (I think that's the expression) on Intel chips, unlike 2004?

    On both the Office 2004 CD and in the folder it was installed to, you'll find a program named Remove Office. That will delete everything on the hard drive related to Office 2004 and any tryout version it finds.
    Make sure to back up your Office data first! That would be any Word, Excel and PowerPoint documents you've created, and your Entourage data. In your user account, your Entourage data is in a folder named Main Identity. Back that up!
    When you install Office 2008, it will create the same folder structure as 2004 for Entourage (except that the folder Main Identity goes in will be named with "2008"). You can then put your Main Identity folder in the same spot it was before and Entourage 2008 will pick it right up.
    It's good to remove Office 2004, as 2008 uses a new Normal template for Word and new preference files. If the 2004 versions are still present, Office 2008 will run very slowly.

  • Difference between PPM and PDS?

    Hi,
    What are the differences between the production process model (PPM) and the production data structure (PDS)?
    Are they essentially the same, or are there specific applications in which only one of them is useful?
    Thanks in advance.
    Regards

    Hi,
    The PPM is master data that has been available from the very beginning of APO, but it will not be developed any further due to some structural limitations (mainly the lack of engineering change management). The alternative to the PPM is the PDS. The functionality of the PDS was increased with APO 4.1; in releases APO 3.1 and 4.0 the PDS was called the run-time object (RTO). The SAP recommendation is to use the PDS for new implementations whenever possible.
    Also check the link below:
    Re: Can we use PDS for few material and PPM for other material in SCM sysytem ?
    Also refer to SAP Note 1079959 for detailed information on the differences between PPM and PDS.
    Hope it helps you.
    Regards
    Ritesh

  • Can AIR use the iOS Significant-Change Location Service?

    There seems to be no discussion on this topic whatsoever as far as I can tell...
    Does anyone know if this can be used in AIR?

    There is no searchable information on AIR / AS3 / ANE implementations of the iOS Significant-Change Location Service.
    Possibly AIR via a Native Extension cannot do this? Perhaps because of some structural limitation of AIR?
    If someone has done this before with an ANE, please enlighten us.

  • MuVo V100 Prob

    Hello, I have had the Creative MuVo V100 for quite a few months now. Up until very recently, I had no problems with it.
    I added some new songs to a folder, and when I play the folder on the MP3 player it skips those songs completely and I am not able to find them! However, it works perfectly fine on the computer. At first I put them in a folder (which already had songs in it before) under 'Root', and then I moved all of them under 'Library B'. It still skips the same songs in both folders when I try playing them. Another problem is that sometimes the order of the folders under 'Root' changes. If anyone has solutions to these problems, I would be very grateful!

    I don't think that the MuVo series can address sub folders [folders within other folders].
    I save albums in separate folders on my computer and then drag an entire album folder to my MuVo, so I'll usually have 6 or more folders at a time in my 52 Mb MuVo.
    You could do it as playlists too - a group of songs in each folder - but I don't think that the MuVo can access a folder inside another folder, because of file structure limitations.

  • Sh dlsw reachability searching

    Hello,
    I am trying to configure DLSw between two remote networks and it is very, very slow. I wonder why the router is not finding some local MACs. Could you please give me some advice? This is the show command output:
    sh dlsw reach
    DLSw Local MAC address reachability cache list
    Mac Addr         status     Loc.    port                 rif
    0006.299c.1cb9   SEARCHING  LOCAL
    0040.cd53.c834   SEARCHING  LOCAL
    0040.cd53.c8e4   SEARCHING  LOCAL
    0064.99c6.5903   FOUND      LOCAL   TBridge-001    --no rif--
    1000.5a58.05f3   SEARCHING  LOCAL
    1000.d000.1244   SEARCHING  LOCAL
    1000.d000.1261   SEARCHING  LOCAL
    1000.d000.6800   SEARCHING  LOCAL
    1000.d000.a809   SEARCHING  LOCAL
    1000.d07f.04aa   SEARCHING  LOCAL
    1000.d07f.90bc   SEARCHING  LOCAL
    1000.d07f.b5b9   SEARCHING  LOCAL
    4200.0099.8984   SEARCHING  LOCAL
    4200.24ff.66c3   FOUND      LOCAL   TBridge-001    --no rif--
    sh dlsw circuits
    Index           local addr(lsap)    remote addr(dsap)  state          uptime
    2617245708      0090.d6a6.6933(04)  4200.0099.8980(04) CONNECTED      03:16:46
    Total number of circuits connected: 1
    sh dlsw peer
    Peers:                state     pkts_rx   pkts_tx  type  drops ckts TCP   uptime
    TCP 10.10.218.241   CONNECT     327135     59912  conf      0    1   0 03:19:36
    Total number of connected peers: 1
    Total number of connections:     1
    interface Vlan229
    description LINEA ADMINISTRATIVA
    ip address 10.60.229.240 255.255.255.0 secondary
    ip address 10.60.2.240 255.255.255.0
    no ip redirects
    bridge-group 1
    llc2 ack-max 255
    llc2 local-window 2
    llc2 t1-time 10000
    interface Vlan232
    description LINEA SISTEMA ESPEJO AS/400 CONTINGENCIA
    ip address 10.60.232.240 255.255.255.0
    ip policy route-map FRAGMENT
    bridge-group 1
    end
    Thanks!!

    No, the limit of 4 is a software structure limitation.

  • What are the limitations of structures in a query?

    Hi,
    What are the limitations of structures? How many structures can we create in a query?
    Is there any limit on this?

    Hi,
    You can only create 2 structures in a query. It is a big limitation, because a third one is needed in many cases.
    I hope it helps.
