OSS not compiling

I have recently upgraded Arch.
$ uname -r
3.9.5-1-ARCH
After upgrading, OSS does not work; it says:
"$ sudo soundon
modprobe: ERROR: could not insert 'osscore': Invalid argument
Loading the osscore module failed"
I have oss 4.2_2007-4 installed.
If I reboot, a "starting" file is created in the /usr/lib/oss folder.
If I remove this file and run soundon again, the file is recreated and the same error message is shown.
Any ideas?
Last edited by toni (2013-07-27 12:52:49)

As said in the previous post, I was using oss-4.2_2007-2 because oss-4.2_2007-4 was not working with kernel 3.9.5-1-ARCH, so I downgraded OSS and the issue was solved.
Now I have upgraded the kernel to 3.10.2-1-ARCH and neither oss-4.2_2007-2 nor oss-4.2_2007-4 works with this kernel version, so I have no sound on my Arch system.
output for 4.2_2007-4:
$ systemctl status oss.service
oss.service - Open Sound System v4
Loaded: loaded (/usr/lib/systemd/system/oss.service; enabled)
Active: failed (Result: exit-code) since Sat 2013-07-27 16:08:36 CEST; 5min ago
Process: 453 ExecStart=/usr/bin/soundon (code=exited, status=20)
$ sudo soundon
Relinking OSS kernel modules for ""
This may take few moments - please stand by...
OSS build environment set up for REGPARM kernels
Building module osscore
Failed to compile OSS
make -C /usr/lib/modules/3.10.2-1-ARCH/build M=/usr/lib/oss/build modules
make[1]: Entering directory `/usr/src/linux-3.10.2-1-ARCH'
CC [M] /usr/lib/oss/build/osscore.o
/usr/lib/oss/build/osscore.c: In function ‘init_proc_fs’:
/usr/lib/oss/build/osscore.c:287:8: error: implicit declaration of function ‘create_proc_entry’ [-Werror=implicit-function-declaration]
create_proc_entry ("opensound", 0700 | S_IFDIR, NULL)) == NULL)
^
/usr/lib/oss/build/osscore.c:286:22: warning: assignment makes pointer from integer without a cast [enabled by default]
if ((oss_proc_root =
^
/usr/lib/oss/build/osscore.c:293:26: warning: assignment makes pointer from integer without a cast [enabled by default]
if ((oss_proc_devfiles =
^
/usr/lib/oss/build/osscore.c:300:20: error: dereferencing pointer to incomplete type
oss_proc_devfiles->proc_fops = &oss_proc_operations;
^
/usr/lib/oss/build/osscore.c: In function ‘alloc_fop’:
/usr/lib/oss/build/osscore.c:964:14: warning: assignment from incompatible pointer type [enabled by default]
fop->fsync = oss_no_fsync;
^
/usr/lib/oss/build/osscore.c: In function ‘oss_pci_read_devpath’:
/usr/lib/oss/build/osscore.c:1638:3: warning: return discards ‘const’ qualifier from pointer target type [enabled by default]
return dev_name(&dip->pcidev->dev);
^
In file included from include/linux/kernel.h:12:0,
from include/linux/cache.h:4,
from include/linux/time.h:4,
from include/linux/stat.h:18,
from include/linux/module.h:10,
from /usr/lib/oss/build/osscore.c:15:
/usr/lib/oss/build/osscore.c: In function ‘oss_fp_check’:
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:73:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:112:3: note: in expansion of macro ‘raw_local_save_flags’
raw_local_save_flags(flags); \
^
/usr/lib/oss/build/osscore.c:1862:3: note: in expansion of macro ‘local_save_flags’
local_save_flags (flags_reg);
^
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:78:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:102:7: note: in expansion of macro ‘raw_irqs_disabled_flags’
if (raw_irqs_disabled_flags(flags)) { \
^
/usr/lib/oss/build/osscore.c:1864:3: note: in expansion of macro ‘local_irq_restore’
local_irq_restore (flags_reg);
^
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:68:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:103:4: note: in expansion of macro ‘raw_local_irq_restore’
raw_local_irq_restore(flags); \
^
/usr/lib/oss/build/osscore.c:1864:3: note: in expansion of macro ‘local_irq_restore’
local_irq_restore (flags_reg);
^
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:68:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:107:4: note: in expansion of macro ‘raw_local_irq_restore’
raw_local_irq_restore(flags); \
^
/usr/lib/oss/build/osscore.c:1864:3: note: in expansion of macro ‘local_irq_restore’
local_irq_restore (flags_reg);
^
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:73:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:112:3: note: in expansion of macro ‘raw_local_save_flags’
raw_local_save_flags(flags); \
^
/usr/lib/oss/build/osscore.c:1866:3: note: in expansion of macro ‘local_save_flags’
local_save_flags (flags_reg);
^
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:78:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:102:7: note: in expansion of macro ‘raw_irqs_disabled_flags’
if (raw_irqs_disabled_flags(flags)) { \
^
/usr/lib/oss/build/osscore.c:1871:3: note: in expansion of macro ‘local_irq_restore’
local_irq_restore (flags_reg);
^
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:68:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:103:4: note: in expansion of macro ‘raw_local_irq_restore’
raw_local_irq_restore(flags); \
^
/usr/lib/oss/build/osscore.c:1871:3: note: in expansion of macro ‘local_irq_restore’
local_irq_restore (flags_reg);
^
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:68:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:107:4: note: in expansion of macro ‘raw_local_irq_restore’
raw_local_irq_restore(flags); \
^
/usr/lib/oss/build/osscore.c:1871:3: note: in expansion of macro ‘local_irq_restore’
local_irq_restore (flags_reg);
^
include/linux/typecheck.h:11:18: warning: comparison of distinct pointer types lacks a cast [enabled by default]
(void)(&__dummy == &__dummy2); \
^
include/linux/irqflags.h:73:3: note: in expansion of macro ‘typecheck’
typecheck(unsigned long, flags); \
^
include/linux/irqflags.h:112:3: note: in expansion of macro ‘raw_local_save_flags’
raw_local_save_flags(flags); \
^
/usr/lib/oss/build/osscore.c:1873:3: note: in expansion of macro ‘local_save_flags’
local_save_flags (flags_reg);
^
cc1: some warnings being treated as errors
make[2]: *** [/usr/lib/oss/build/osscore.o] Error 1
make[1]: *** [_module_/usr/lib/oss/build] Error 2
make[1]: Leaving directory `/usr/src/linux-3.10.2-1-ARCH'
make: *** [default] Error 2
Relinking the OSS kernel modules failed
Any ideas?
Last edited by toni (2013-07-27 12:20:59)

Similar Messages

  • OSS Notes Please

    Dear Gurus,
    Can anyone send me the following OSS notes if you have them: 567747, 130253 and 417307. Your kind help will definitely be awarded.
    My mail ID is [email protected]
    Best Regards
    Mohan Kumar
    Message was edited by: mohan kumar

    Hi Mohan,
    I think you will need to have access to OSS yourself;
    note 567747 is a composite one that contains several notes.
    Sent to your mail ...
    130253
    Symptom
    Uploading transaction data to BW takes too long
    Other terms
    Business Warehouse, data upload, batch upload, transaction data upload,
    performance, runtime, data load, SPLIT_PARTITION_FAILED ORA00054
    Reason and Prerequisites
    Loading data from a mySAP system (for example, R/3) or from a file takes a very long time.
    Solution
    The following tips are a general check list to make the mass data upload to the Business Warehouse (BW) System as efficient as possible.
    Tip 1:
    Check the parameter settings of the database as described in composite Note 567745.
    Check the basis parameter settings of the system.
    Note 192658 Setting basis parameters for BW Systems
    See the following composite notes:
    Note 567747 Composite note BW 3.x performance: Extraction & loading
    Note 567746 Composite note BW 3.x performance: Query & Web applications
    Tip 2:
    Import the latest BW Support Package and the latest kernel patch into your system.
    Tip 3:
    Before you upload the transaction data, you should make sure that ALL related master data has been loaded into your system. If no master data has been loaded yet, the upload may take up to 100 percent longer because, in this case, the system must retrieve master data IDs for the characteristic attributes and add new records to the master data tables.
    Tip 4:
    If possible, always use TRFC (PSA) as the transfer method instead of
    IDocs. If you (have to) use IDocs, keep the number of data IDocs as low
    as possible. We recommend an IDoc size of between 10000 (Informix) and 50000 (Oracle, MS SQL Server).
    To upload from a file, set this value in Transaction RSCUSTV6.
    To upload from an R/3 system, set this value in R/3 Customizing (SBIW -> General settings -> Control parameters for data transfer).
    Tip 5:
    If possible, load the data from a file on the application server and not from the client workstation as this reduces the network load. This also allows you to load in batch.
    Tip 6:
    If possible, use a fixed record length when you load data from a file (ASCII file). For a CSV file, the system only carries out the conversion to a fixed record length during the loading process.
    Tip 7:
    When you load large data quantities from a file, we recommend that you split the file into several parts. We recommend using as many files of the same size as there are CPUs. You can then load these files simultaneously to the BW system in several requests. To do this, you require a fast RAID.
    Tip 8:
    When you load large quantities of data in InfoCubes, you should delete
    the secondary indexes before the loading process and then recreate them afterwards if the following applies: The number of the records that are loaded is big in comparison to the number of records that already exist in the (uncompressed) F fact table. For non-transactional InfoCubes, you must delete the indexes to be able to carry out parallel loading.
    Tip 9:
    When you load large quantities of data in an InfoCube, the number range buffer should be increased for the dimensions that are likely to have a high number of data sets.
    To do this, proceed as follows. Use function module RSD_CUBE_GET to find the object name of the dimension that is likely to have a high number of data sets.
    Function module settings:
    I_INFOCUBE = 'Infocube name'
    I_OBJVERS = 'A'
    I_BYPASS_BUFFER = 'X'
    The numbers for the dimensions are then contained in table 'E_T_DIME', column 'NUMBRANR'. If you enter 'BID' before this number, you get the relevant number range (for example BID0000053).
    You can use Transaction SNRO (-> ABAP/4 Workbench -> Development --> Other tools --> Number ranges) to display all number ranges for the dimensions used in BW if you enter BID*. You can use the object name that was determined beforehand to find the required number range.
    By double-clicking this line, you get to the number range maintenance. Choose Edit -> Set-up buffering -> Main memory, to define the 'No. of numbers in buffer'.
    Set this value to 500, for example. The size depends on the expected data quantity in the initial and in future (delta) uploads.
    !! Never buffer the number range for the package dimension !!
    Tip 10:
    When you load large quantities of data, you should increase the number
    range buffer for the info objects that are likely to have a high number of data sets. To do this, proceed as follows:
    Use function module RSD_IOBJ_GET to find the number range name of the info object that is likely to have a high number of data sets.
    Function module settings:
    I_IOBJNM = 'Info object name'
    I_OBJVERS = 'A'
    I_BYPASS_BUFFER = 'X'
    The number for the info object is in table 'E_S_VIOBJ', column 'NUMBRANR'. Enter 'BIM' in front of this number to get the required number range (for example BIM0000053).
    Use Transaction SNRO (-> ABAP/4 Workbench -> Development --> Other tools --> Number ranges) to display all number ranges used for the info objects in BW by entering BIM*. By entering the object name determined beforehand you can find the desired number range.
    By double-clicking this line you get to the number range object maintenance. Choose Edit -> Set-up buffering -> Main memory, to define the 'No. of numbers in buffer'.
    Set this value to 500, for example. The size depends on the expected data quantity in the initial and in future (delta) uploads.
    !! Never buffer the number range object for the characteristic 0REQUEST!!
    417307
    Symptom
    Performance load is too high/not justifiable during data load.
    Customizing settings via extractors IMG path (Transaction SBIW in the OLTP system; can be called directly in the OLTP or using Customizing, from the BW system) do not yield any considerable improvement or are not clear.
    The settings in Table ROIDOCPRMS or in the Scheduler in the BW system are not taken into account by some extractors. How is the data package size defined for the data transfer to the BW? Are there application-specific features to determine the package size? If so, what are they?
    Other terms
    SBIW, general settings extractors, MAXSIZE, data volume, OLTP, service API, data package size, package size, performance
    Reason and Prerequisites
    The general formula is:
          Package size = MAXSIZE * 1000 / size of the transfer structure,
                        but not more than MAXLINES.
    You can look up the transfer structure (extract structure) in Table ROOSOURCE in the active version of the DataSource and determine its size via SE11 (DDIC) -> Utilities -> Runtime object -> Table length.
    The system default values of 10,000 or 100,000 are valid for the MAXSIZE and MAXLINES parameters (see the F1 help for the corresponding fields in ROIDOCPRMS). You can use the IMG Transaction SBIW (in the OLTP system) "Maintain General settings for extractors" to overwrite these parameters in Table ROIDOCPRMS on a system-specific basis. You also have the option of overriding these values in the scheduler (in the target BW system). However, in the Scheduler (InfoPackage) you can only reduce the MAXSIZE. The advantage of using the Scheduler to carry out maintenance is that the values are InfoSource-specific.
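    As a purely illustrative calculation (the transfer structure size is assumed, not taken from the note): with the default MAXSIZE of 10,000 and a transfer structure of 500 bytes, the package size would be 10,000 * 1000 / 500 = 20,000 records, which is below the default MAXLINES of 100,000 and is therefore used unchanged.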
    However, some extractors have their own flow logic and do not take MAXSIZE 1:1 from ROIDOCPRMS.
    This Note does not cover all SAP applications.
    Solution
    Application/DataSource               Standard settings or note
    Generic extractor                     Standard (Example: Note 409641)
    Delta Extraction via DeltaQueue       Standard as of PlugIn 2000.2
                                               Patch 3
    LO-LIS                                 Standard
    Logistic Cockpit SD                       Notes 419465 and 423118
    Logistic Cockpit MM-IM:
    Extraction 2LIS_03_UM                     Notes 537235 and 585750
    Extraction 2LIS_03_BF                     Note 454267
               In general, the following applies to Logistic Cockpit Extraction: The package size set only serves as guideline value.
                Depending on the application, contents and structure of the documents, and the selection in the reconstruction program, the actual size of the transfer packages may differ considerably. That is not an error.
                For 'Queued Delta' update mode the package size of 9999 LUWs (=Logical Unit of Work, in this particular case documents or posting units per transaction are concerned) is set for LO Cockpit. However, if a transaction updated more than 10000 documents at once, the number 9999 would be invalid. This is because an update process cannot be split.
                In the case of a 'Direct Delta' update mode there is no package size for LUW bundling. In the DeltaQueue (RSA1) every LUW is updated individually.
               Also note the following:
                         If you want to transfer large data sets into the BW System, it is a good idea to carry out the statistical data setup and the subsequent data transfer in several sub-steps. In doing so, the selections for the statistical data setup and in the BW InfoPackage must correspond to each other. For performance reasons, we recommend using as few selection criteria as possible. You should avoid complex selections. After loading init delta with selection 1, the setup table has to be deleted and rebuilt with selection 2.
                         Bear the following in mind: The delta is loaded for the sum of all selections from the init deltas. Remember that the selections are made so that they do not impede the delta load, for example, if you have initialized the delta for the periods January 1st, 1999 to December 1st, 2000 and December 2nd, 2000 to August 1st, 2001 then you get a time interval from January 1st, 1999 to August 1st, 2001 in the BW. Documents from August 2nd, 2001 are no longer loaded.
    CO-OM                                     Note 413992
    PS                                        Standard or according
                                               to Note 413992
    CO-PC                                     Standard
    FI (0FI-AR/AP-3)                          First, tables with open items (BSID for customers, BSIK for vendors) are read. The package size from the MAXSIZE field in Table ROIDOCPRMS is used in the extraction from these tables. If the packages are grouped together according to ROIDOCPRMS and a few data records still remain, these remaining records are transferred in one additional package.
    After the open items are extracted, the system extracts from the table of cleared items (BSAD for customers, BSAK for vendors). In this extraction, the package size from the MAXSIZE field is adhered to.
    FI (0FI-/AR/AP/-4 , new as of BW 30A) Like FI-AR/AP-3, but with one difference: if there are remaining records after the system reads the open items using the setting in the MAXSIZE field, they are not transferred in one extra package, but added to the first package of items read from the table of cleared items. For example, if 10 data packages with 15000 data records are extracted from Table BSID in accordance with ROIDOCPRMS, and 400 data records remain, the package size of the first data package from Table BSAD is 15400.
    Both new and changed records are formatted in the following sequence in the delta transfer: 1) new BSIK/BSID records; 2) new BSAK/BSAD records; 3) changed BSIK/BSID records; 4) changed BSAK/BSAD records.
    Package size 0FI_GL_4:
    Prior to Note 534608 and the related notes, package size could vary considerably since the MAXLINES were applied to the document headers only. Then all documents lines for the document headers were read and transferred. As a result, the packages were 2 to 999 times as large as MAXLINES depending on the number of line items per document.
    Note 534608 and the related notes changed the logic so that the MAXLINES is now also applied to the document lines. For each package, MAXLINES can be exceeded by up to 998 lines since a document is always transferred completely in one package. Smaller 'remaining packages' may also occur; for example if MAXLINES = 10000, and 10000 document headers with 21000 lines are selected, 2x10000 and the remainder of 1000 are transferred in a separate package. Selection logic in 0FI_GL_4: Selection of the new FI documents via CPUDT -> the changed documents are then selected via Table BWFI_AEDAT. When changing the selection from new to changed documents, a package may occur which consists of the 'remainder' of the CPUDT selection and the first package of the BWFI_AEDAT selection. This package can then have a maximum size of 2 x MAXLINES.
    FI-FM                                  Note 416669
    EC-PCA                                 For the most part, the system adheres to the standard settings but for technical reasons, packages smaller than the MAXSIZE or 7200 larger than the MAXSIZE may be omitted in Table ROIDOCPRMS
    FI-SL                                  as in EC-PCA
    PT                                     Standard (refer to Note
                                           397209)
    PY                                     Standard
    PA                                     Standard
    RE                                     Standard
    ISR-CAM (Category Management,
             new as of PlugIn 2001.1)     Standard
    CO-PA                                  During the initialization and full update in the profitability analysis, a join is always read from two tables (for details see Note 392635). To avoid terminations caused by Select statements that run for too long, access occurs with intervals for the object numbers (fixed size 10,000). New intervals are read until the package size requested by BW is reached. Therefore the size of the data package is always equal to or larger than the specification, but it can vary considerably.
    Master data:
    Business partner                       Standard
    Product                                Standard
    Customer                               Standard
    Vendor                                 Standard
    Plant                                  Standard
    Material                               Standard
    567747
    Symptom
    You want to improve the performance of the extraction and loading of your data into SAP BW 3.x.
    Solution
    This is a composite note that deals with performance-relevant topics in the area of extraction and loading.
    If you encounter performance problems, ensure that the current Support Package has been imported.
    This note is continually updated. You should therefore download a new version on a regular basis.
    You will find further documents in the SAP Service Marketplace, alias bw under the folder "Performance".
    Contents:
    I.    Extraction from the OLTP
    II.   Loading generally
    III.  Master data
    IV.   Roll-Up/aggregate structure
    V.    Compression
    VI.   Hierarchies/attribute realignment run
    VII.  DataMart interface
    VIII. ODS objects
    IX.   Miscellaneous
    I.    Extraction from the OLTP
    Note 417307: extractor packet size: Collective note for applications
    Note 505700: LBWE: New update methods from PI 2002.1
    Note 398041: INFO: CO-OM/IM IM content (BW)
    Note 190038: Composite note performance for InfoSource 0CO_PC_01 and
    Note 436393: Performance improvement for filling the setup tables
    Note 387964: CO delta extractors: poor performance for Deltainit
    II.   Loading generally
    Note 130253: Notes on uploading transaction data into BW
    Note 555030: Deactivating BW-initiated DB statistics
    Note 620361: Performance data loading/Admin. data target, many requests
    III.  Master data
    Note 536223: Activating master data with navigation attributes
    Note 421419: Parallel loading of master data (several requests)
    IV.   Roll-Up/aggregate structure
    Note 484536: Filling aggregates of large InfoCubes
    Note 582529: Rollup of aggregates & indexes (again as of BW 3.0B Support Package 9)
    V.    Compression
    Note 375132: Performance optimization for InfoCube condensation
    Note 583202: Change run and condensing
    VI.   Hierarchies/attribute realignment run
    Note 388069: Monitor for the change run
    Note 176606: Apply Hierarchy/Attribute change ... long runtime
    Note 534630: Parallel processing of the change run
    Note 583202: Change run and condensing
    VII.  DataMart interface
    Note 514907: Processing complex queries (DataMart, and so on)
    Note 561961: Switching the use of the fact table view on/off
    VIII. ODS objects
    Note 565725: Optimizing the performance of ODS objects in BW 3.0B
    IX.   Miscellaneous
    Note 493980: Reorganizing master data attributes and master data texts
    Note 729402: Performance when compiling and searching in the AWB

  • How to find out the list of implemented OSS Notes in a particular period.

    Hi Friends,
    Need help.
    I need to find out the list of OSS Notes implemented in a period in a particular system.
    Could you please suggest a table, a program, or any other way to find out the OSS Notes implemented, date-wise?
    Thanks in Advance.
    Sreenivas

    Hi Sreenivas.
    How did you find the solution to this? Trying to do the same thing!
    Cheers,
    Tom

  • Oss note problem

    Hi ,
    I applied OSS note 1046758 to 2 includes and 1 program.
    includes:
        J_1IEWT_CERT_TOP
        J_1IEWT_CERT_F01
    program :
           J_1IEWT_CERT
    But I am getting an error like 'report/program statement missing'.
    Please give me a solution.
    Edited by: vasanth kandula on Jan 4, 2008 1:25 PM

    Hi,
    Open the documentation of that note in transaction SNOTE and go through it; there they will propose the piece of code to delete and the piece of code to add. Check whether it has been done correctly in your includes and program.
    While applying notes, the system first checks for the context block where it has to delete the code and where it has to include the new code; sometimes, due to manual changes, there is a chance the code is misplaced.
    Go through the documentation carefully; it will guide you.
    Reward if useful.
    Thanks,
    Sreeram.

  • Error while transforming XSLT,"Could not compile stylesheet"

    Hi,
    During transformation of my XSLT I need to fetch data from a method named "myMethod(String str)", which is in the "mypackage.test.MyClass" class. MyClass is in XXX.jar.
    This XXX.jar is not in the context of my web application.
    Following is part of XSLT which I am using.
    <?xml version="1.0"?>
    <xsl:stylesheet version="1.0" xmlns:aaa="mypackage.test.MyClass">
    <xsl:template match="/responseData">
    <xsl:for-each select="response">
    <XMLResponse>
    <xsl:for-each select="status">
    <xsl:variable name="Vvar_ResResponseType" select="."/>
    <xsl:attribute name="ResResponseType">
    <xsl:value-of select="aaa:myMethod($Vvar_ResResponseType)"/>
    </xsl:attribute>
    </xsl:for-each>
    </XMLResponse>
    </xsl:for-each>
    </xsl:template>
    </xsl:stylesheet>
    So I tried to use the reflection API to load the XXX.jar file at runtime.
    But while transforming, the Transformer does not find "myMethod(String str)" and gives an error like "Could not compile stylesheet".
    Following is full exception stacktrace
    ERROR:  'The first argument to the non-static Java function 'myMethod' is not a valid object reference.'
    FATAL ERROR:  'Could not compile stylesheet'
    javax.xml.transform.TransformerConfigurationException: Could not compile stylesheet
    at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTemplates(TransformerFactoryImpl.java:829)
    at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:623)
    at com.actl.dxchange.utilities.Transformation.transform(Transformation.java:83)
    at com.actl.dxchange.base.BaseConnector.transform(BaseConnector.java:330)
    at com.actl.dxchange.connectors.KuoniConnector.doRequestProcess(KuoniConnector.java:388)
    at com.actl.dxchange.connectors.KuoniConnector.hotelAvail(KuoniConnector.java:241)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    ...........
    Please suggest if there is any other way, so that the Transformer can find the required bean class from XXX.jar during the transformation process.
    Thanks & Regards,
    Rohit Lad
    Edited by: Rohit_Lad on Jan 29, 2009 7:38 PM
    Edited by: Rohit_Lad on Jan 30, 2009 9:57 AM
    Edited by: Rohit_Lad on Jan 30, 2009 10:02 AM

    Got the solution from the forum thread named
    "Reflections & Reference Objects".
    Following is the link for anyone who encounters this issue.
    http://forums.sun.com/thread.jspa?threadID=5362426
    Edited by: Rohit_Lad on Jan 30, 2009 2:35 PM
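    For reference, the "first argument to the non-static Java function" message from XSLTC means the stylesheet is calling an instance method without supplying an object reference. A minimal sketch of one workaround, assuming the mypackage.test.MyClass name from the post and a hypothetical method body: declare the extension method static and put XXX.jar on the classpath of the JVM running the transformation, rather than loading it reflectively.
    package mypackage.test;

    // Hypothetical sketch: the real body of myMethod is not shown in the post.
    public class MyClass {

        // A static method can be invoked as aaa:myMethod($var) with
        // xmlns:aaa="mypackage.test.MyClass", without an object reference
        // as the first argument (the cause of the reported error).
        public static String myMethod(String str) {
            return str == null ? "" : str.trim();
        }
    }
    The alternative is to keep the method non-static and construct an instance inside the stylesheet, passing it as the first argument, but making the method static is the smaller change.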

  • ABAP dump in ECC 6.0 - Need OSS notes

    An ABAP dump occurs when retrieving data between SELECT and ENDSELECT.
    Runtime Errors         DBIF_RSQL_INVALID_CURSOR
    Except.                CX_SY_OPEN_SQL_DB
    The problem cannot be resolved without correcting the application
    program.
    The statements MESSAGE, COMMIT WORK, ROLLBACK WORK, CALL SCREEN,
    BREAK-POINT, WAIT, CALL FUNCTION ... DESTINATION, CALL FUNCTION ...
    STARTING NEW TASK, RECEIVE RESULTS, CALL SELECTION-SCREEN, CALL
    DIALOG, CALL TRANSACTION cannot be used within a database
    loop.
    You can collect the data needed for these statements together in an
    internal table and process them
    in a loop.
    If the error occurs in a non-modified SAP program, you may be able to
    find an interim solution in an SAP Note.
    If you have access to SAP Notes, carry out a search with the following
    keywords:
    "DBIF_RSQL_INVALID_CURSOR" "CX_SY_OPEN_SQL_DB"
    Can you please give the OSS note number, if anything is available?

    Hi Sreedevi..
    In the dump report, the piece of code where the dump occurred will be shown,
    marked with a >>>>>> sign on the line...
    Please copy and paste this piece of code.
    Thanks and Best Regards,
    Vikas Bittera.
    **Reward if useful**

  • OSS note is not working in production but works correctly in the test system

    hello friends ,
    We have implemented one OSS note, which is working correctly in the test system but not working in production.
    Could you please let me know what could be the reason?
    OSS note no. 1064273.
    Regards,
    Manoj

    Please check with Basis whether the OSS note has been implemented properly and successfully.

  • In what type of HA context should OSS Note 1052984 be applied?

    Hello,
    I was wondering if somebody could clarify the term "High Availability" (HA) in an SAP context.
    I installed a PI 7.11 (SID=ABC) system as part of an HA cluster failover.
    The "ABC" system is running on server host "123".
    I have another server host, "456", that is part of the HA cluster failover. This server has two other SAP instances, "DEF" and "GHI", installed on it.
    When "ABC" needs to fail over, due to hardware failure, etc., the two instances "DEF" & "GHI" are shut down by the HA cluster software on host "456", and then "ABC" fails over to server "456" and runs on it until the original host, "123", is available again for "ABC" to run on.
    I was reading OSS Note 1052984, "Process Integration 7.1 - High Availability", and in this note's context it seems like HA is referring to an instance where the
    ASCS - (ABAP System Central Services)
    SCS - (Java System Central Services)
    DVEBMGS - (Primary Application)
    DB - (Database Instance)
    are all installed on separate servers, and not necessarily on an "HA" cluster, where one SAP instance resides on one server and can fail over to another node B server when the need arises.
    Any insight into what context Note 1052984 is referring to would be very helpful.
    Thank you.
    Regards.

    Hi Hiko,
    You might find the answer to your questions on the following links:
    http://www.sdn.sap.com/irj/sdn/ha#section12
    http://www.sdn.sap.com/irj/sdn/index?rid=/webcontent/uuid/b09d53ac-8b0f-2b10-6798-ecfd55894839 [original link is broken]
    Hope this helps!
    Kind regards,
    Mark

  • Problem related to OSS Note - importing file problem

    Hi,
    I am facing an issue while implementing OSS note 430580; I have to import files from SAPSERV,
    but I don't know how to do it.
    Below is the description of my problem.
    Import the following files from SAPSERV:
    /general/R3server/abap/note.0430580/K000687.PLM
    /general/R3server/abap/note.0430580/R000687.PLM
    Import the two files into your system.
    For more information on SAPSERV, see Note 13719.
    Can anyone tell me how I can get rid of this problem?
    Is this something I can do myself, or can only the Basis team do it?
    system details
    Operating system     SunOS
    Machine type         sun4u
    R/3 release 4.6C

    Well, these files are part of the zip file that belongs to the note. This is part of a regular transport and is normally done by Basis. The data file and the cofile (K....PLM & R...PLM) should be imported into your system. Before doing so, they have to be transferred to the file server. Better let Basis do this kind of thing.

  • OSS note 506603 What is it And where can I get it?

    Hello,
    I am trying to create my form with an SAP Table wizard connection. When I try to use the Table wizard I get the error: "No such interface supported". I can see in a lot of comments that the solution is in OSS note 506603, but I was trying to find this patch on the SDN site and could not find it. I am not sure if it is a document, an exe file, or a DLL, and when I see the same question asking where to get OSS note 506603 there is no answer. Could somebody help me by telling me what this is and where I can get the patch? I would really appreciate it.
    Many thanks.

    Hi Eden,
    How are you doing?
    OSS note is the older nomenclature for what are now SAP Notes.
    They are available at:
    http://service.sap.com/notes
    You will need a Service Marketplace user ID to log on.
    Alternatively, if you do not have access to Service Marketplace,
    send me an email [view my business card] and I will reply with the patch. It is a few hundred KB.
    With respect,
    amit
    The text of note 506603 follows:
    Summary
    Symptom
    Additional information for SAP.Net Connector that was not included in the guide.
    Other terms
    SAP .NET Connector, Proxy Wizard
    Solution
    SAP.NET Connector Version 1.0 has been available since Nov 21, 2002. You can download the software from http://service.sap.com/connectors.
    Version 2.0 is also available. Use only the new version for new developments.
    Report any problems with the SAP.NET Connector under the components:
    BC-MID-CON-DNW for problems with the .NET Proxy Wizard for Visual Studio.NET.
    BC-MID-CON-DNC for problems with Communication Runtime.
    Known problems
    1. The Proxy Wizard terminates with the error message "Exception caught in method createFunction for DD_GET_UCLEN Reason: null".
    This is caused by a missing function module, which can be installed as described in Note 580834.
    2. Corrections for SAP.NET Connector 1.0
    The updated version corrects various problems that have been noted during the interim period. You can download it from http://service.sap.com/connectors. To install this version, see the "Release Notes" file. For SAP.NET Connector 1.x, the newest patch version is called "1.0.3".
    3. Corrections for SAP.NET Connector 2.0
    For Visual Studio Integration (Design Time) following problems have occurred:
    For proxies with several functions, the following runtime error can occur under certain circumstances: "Method xyz can not be reflected." The cause is a combination of an error in the .NET framework and the way in which the proxy generator uses XML-attributes.
    The SAP Table wizard does not work. Existing tables are not recognized.
    Both problems are solved by the patch attached to this note. Stop all instances of Visual Studio and unpack the ZIP file into the "%Program Files%\Microsoft Visual Studio .NET 2003/Common7/IDE" directory, whereby the "SAP.Connector.Design.DLL" file is replaced.
    Header Data
    Release Status: Released for Customer
    Released on: 10.12.2004  16:40:26
    Priority: Recommendations/additional info
    Category: Installation information
    Primary Component: BC-OP-NET-NCO SAP .Net Connector
    Secondary Components: BC-OP-NET-VS Visual Studio .Net Connector Wizard
    Message was edited by:
            amit chawathe

  • Details of OSS Note in SAP tables

    Hi Experts,
    Do we have any standard report, or a simple way, by which I can get all the contents of a specific OSS note without going into the SNOTE transaction? The reason I am asking is that I am analysing a large volume of OSS notes; if I need to go through SNOTE and then find the objects, it will be tedious. I know the table where these details are stored, but I want to know if there is a handy report to use.
    Kindly help
    Regards,
    Vijay V

    Hello,
    I'm afraid there is no such report in SAP with which you'll get the content of any SAP note.
    Another option besides the SNOTE transaction is to use the SAP Marketplace to find out the same.
    Regards,
    Nitin

  • Clarification about OSS note 677377 - E-mail to credit representative

    Hello,
    In the OSS note mentioned in the subject I find the following statement: "The document can be released directly from the e-mail".
    I have this question: is the option to release the document from the e-mail available only if the workflow is active?
    I made all the settings mentioned in the OSS note but I receive in SAPoffice only the notification with no option to release the SO.
    Thanks and best regards,
    Andrea

    Hi,
    try the following:
    Open your workplace in SAP office and go to the inbox folder. Click on unread or read documents and the documents will be displayed on the right side of the screen. Select one of the documents, click the right mouse button and choose execute. You'll execute the tcode you set in the execute parameters of your message.
    Regards,
    JM

  • XSLT Exception: FATAL ERROR:  'Could not compile stylesheet'

    Hi...
    I am getting the following exception while parsing the XSL file. I am using Java 1.4 and the MSXML 4.2 SP2 parser and SDK,
    but when I installed SQL Server 2005, the MSXML 6.0 parser was installed with it.
    It was working fine before the installation of SQL Server 2005.
    java.lang.NullPointerException
         at com.sun.org.apache.xalan.internal.xsltc.compiler.FunctionCall.translate(FunctionCall.java:826)
         at com.sun.org.apache.xalan.internal.xsltc.compiler.ValueOf.translate(ValueOf.java:114)
         at com.sun.org.apache.xalan.internal.xsltc.compiler.SyntaxTreeNode.translateContents(SyntaxTreeNode.java:490)
         at com.sun.org.apache.xalan.internal.xsltc.compiler.XslAttribute.translate(XslAttribute.java:252)
    Compiler warnings:
    file:///F:/Data/JRun4/servers/ABC/MainXSL.xsl: line 155: Attribute 'LenderBranchIdentifier' outside of element.
    file:///F:/Data/JRun4/servers/ABC/MainXSL.xsl: line 156: Attribute 'LenderRegistrationIdentifier' outside of element.
    ERROR: 'null'
    FATAL ERROR: 'Could not compile stylesheet'
    javax.xml.transform.TransformerConfigurationException: Could not compile stylesheet
         at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTemplates(TransformerFactoryImpl.java:753)
         at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:548)
         at com.ls.lsglobal.common.CLOUTSQLXML.TransXml2Xml(CLOUTSQLXML.java:274)
    And my XSL is:
    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="2.5" xmlns:lsjava1="com.ls.lsglobal.common" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" exclude-result-prefixes="lsjava1" >
         <xsl:output method="xml" indent="yes" encoding="utf-8" doctype-system="MyRequest.1.3.DTD"/>
         <xsl:template match="/" >
              <REQUEST_GROUP>
                                       <TRANSMITTAL_DATA>
                                            <xsl:if test="AppraisedVal!=''">
                                                 <xsl:attribute name="AppraisedValueAmount"><xsl:value-of select="AppraisedVal"/></xsl:attribute>
                                            </xsl:if>
                                            <xsl:if test="StatedVal!=''">
                                                 <xsl:attribute name="EstimatedValueAmount"><xsl:value-of select="StatedVal"/></xsl:attribute>
                                            </xsl:if>
                                            <xsl:for-each select="ROOT/AppMain/MyLoop/Liab">
                                                 <xsl:if test=" normalize-space(LiabTypCd) = 'SMG' and (normalize-space(PresFutTypCd) = 'BOTH' or normalize-space(PresFutTypCd) = 'PRES') and PresLienPos='1' and normalize-space(RefCd) != 'LOAN'">
                                                      <xsl:attribute name="CurrentFirstMortgageHolderType"><xsl:value-of select="LiabId"/></xsl:attribute>
                                                 </xsl:if>
                                            </xsl:for-each>
                                            <xsl:attribute name="LenderBranchIdentifier">0001</xsl:attribute>
                                            <xsl:attribute name="LenderRegistrationIdentifier"><xsl:value-of select="ROOT/AppMain/MyNum"/></xsl:attribute>
                                       </TRANSMITTAL_DATA>
              </REQUEST_GROUP>
         </xsl:template>
    </xsl:stylesheet>
    And my Java code is:
    public String TransXml2Xml(String xmlInFile, String xslFile, String xmlOutFile) throws Exception
    {
         try
         {
              DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
              factory.setIgnoringElementContentWhitespace(true);
              Document document;
              File stylesheet = new File(xslFile);
              File dataInfile = new File(xmlInFile);
              DocumentBuilder builder = factory.newDocumentBuilder();
              document = builder.parse(dataInfile);
              System.err.println("-->AVC:::xslFile="+xslFile+" xmlInFile="+xmlInFile+" xmlOutFile="+xmlOutFile);
              StreamSource stylesource = new StreamSource(stylesheet);
              TransformerFactory t = TransformerFactory.newInstance();
              Transformer transformer = t.newTransformer(stylesource);
              DOMSource source = new DOMSource(document);
              javax.xml.transform.stream.StreamResult result = new javax.xml.transform.stream.StreamResult(new File(xmlOutFile));
              transformer.setOutputProperty(javax.xml.transform.OutputKeys.INDENT, "yes");
              transformer.setOutputProperty(javax.xml.transform.OutputKeys.ENCODING, "UTF-8");
              transformer.setOutputProperty("{http://xml.apache.org/xslt}indent-amount", "2");
              // strCLUTDTD_PATH and checkNull(...) are members of the enclosing class (not shown in the post).
              if (strCLUTDTD_PATH.equals(""))
              {
                   java.util.Properties props = new java.util.Properties();
                   java.io.FileInputStream fis = new java.io.FileInputStream("DataFileBasePath.properties");
                   props.load(fis);
                   fis.close();
                   strCLUTDTD_PATH = checkNull(props.get("CLUTREQDTD_PATH"));
              }
              transformer.setOutputProperty(javax.xml.transform.OutputKeys.DOCTYPE_SYSTEM, strCLUTDTD_PATH);
              transformer.transform(source, result);
         }
         catch (Exception e)
         {
              e.printStackTrace();
              throw e;
         }
         return "";
    }
    Does anyone know a solution for this problem? It would be very helpful for me.

    So look at your code and find out what you are passing to the TransformerFactory. Then find out why it's null. Don't just sit there and say "Duh", it's your code.
    And remember that nobody but you can see it yet. If you like you could post it here and then others could see it and comment on it.
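    A practical way to follow that advice is to register an ErrorListener on the TransformerFactory before compiling, so the individual XSLT compiler errors and their locations are printed instead of only the final "Could not compile stylesheet" summary. A minimal standalone sketch, assuming a local copy of the stylesheet saved as MainXSL.xsl:
    import java.io.File;
    import javax.xml.transform.ErrorListener;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamSource;

    public class StylesheetCheck {
        public static void main(String[] args) throws Exception {
            TransformerFactory factory = TransformerFactory.newInstance();
            // Report every warning/error the XSLT compiler raises instead of
            // collapsing them into a single TransformerConfigurationException.
            factory.setErrorListener(new ErrorListener() {
                public void warning(TransformerException e) { System.err.println("WARNING: " + e.getMessageAndLocation()); }
                public void error(TransformerException e)   { System.err.println("ERROR:   " + e.getMessageAndLocation()); }
                public void fatalError(TransformerException e) throws TransformerException {
                    System.err.println("FATAL:   " + e.getMessageAndLocation());
                    throw e;
                }
            });
            // Compiling the stylesheet is enough to reproduce the error.
            factory.newTemplates(new StreamSource(new File("MainXSL.xsl")));
        }
    }
    The reported locations should point at the template or attribute the compiler is rejecting.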

  • NewTransformer - Could not compile stylesheet exception

    Using the following code:
    Transformer transformer = factory.newTransformer( new StreamSource(xsltFile) );
    If I fill the xsltFile string variable with "contents.xslt" it goes through fine.
    But, if I put the following into the xsltFile variable, it throws the following exception:
    "C:\\test\\src\\xslt\\contents.xslt"
    Failed to transform file C:\DB Documenter\output\xml\main.xml due to: javax.xml.transform.TransformerConfigurationException: Could not compile stylesheet
    I am formatting the source and result strings in the actual transform the same way, and they work fine pointing to a c:\ directory. Any reason this blows up on the newTransformer call?
    Here is the whole code:
    public void performTransform(String xsltFile, String xmlFile, String htmlFile) {
        try {
            TransformerFactory factory = TransformerFactory.newInstance();
            /* xsltFile = "contents.xslt"; */
            xsltFile = "C:\\DB Documenter\\src\\xslt\\contents.xslt";
            Transformer transformer = factory.newTransformer( new StreamSource(xsltFile) );
            StreamSource xmlSource = new StreamSource(xmlFile);
            StreamResult htmlResult = new StreamResult(htmlFile);
            transformer.transform(xmlSource, htmlResult);
        } catch (Exception e) {
            JOptionPane.showMessageDialog(null, "Failed to transform file " + xmlFile + " due to: " + e);
        }
    }

    StreamSource/StreamResult are simple POJOs: the system ID string you construct them from is left to the interpretation of the object that will eventually use them. This is why the behavior is not consistent.
    For this reason, I never use the StreamSource/StreamResult(String) constructors, but prefer either the constructor that takes a File or the one that takes a Reader.
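    A minimal sketch of what that reply suggests, reusing the stylesheet and XML paths from the post together with the File-based constructors (the output path is an assumption for the example):
    import java.io.File;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class TransformWithFiles {
        public static void main(String[] args) throws Exception {
            TransformerFactory factory = TransformerFactory.newInstance();
            // Wrapping the path in a File gives the stylesheet an unambiguous
            // system ID instead of leaving a raw String open to interpretation.
            Transformer transformer = factory.newTransformer(
                    new StreamSource(new File("C:\\DB Documenter\\src\\xslt\\contents.xslt")));
            transformer.transform(
                    new StreamSource(new File("C:\\DB Documenter\\output\\xml\\main.xml")),
                    // The output location below is assumed for the sketch.
                    new StreamResult(new File("C:\\DB Documenter\\output\\html\\main.html")));
        }
    }
    With a File (or a Reader) there is no ambiguity about whether the String is a path or a URI, which is the inconsistency the reply describes.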

  • TransformerConfigurationException...Could not compile stylesheet

    So far my XSLT stylesheet was executed in operations mapping without problems.
    However I had to make some changes (e.g. adding some additional xsl:choose elements) and now I get the following compilation error:
    TransformerConfigurationException occured when loading XSLT xxxxxxx; details: Could not compile stylesheet
    I develop and change the XSLT externally using XML Spy and there I could run a test transformation without problems or errors.
    Only after reimporting it as an IA archive and selecting the new version in operations mapping do I get the error.
    How is this possible? How can I debug to find the root cause for the error?
    Edited by: Florian Guppenberger on Nov 13, 2009 6:05 PM

    Hello,
    what I have done is to add an additional <xsl:choose> to an already existing <xsl:choose> in the <xsl:otherwise> branch at several locations. Example below.
    I hope there was no typo somewhere else; however, what is very strange is that XML Spy says the XSLT is well-formed and a test transformation executes successfully too.
    For this XSLT it is not such a big problem, because I can roll back to the previous version, but I am really concerned that you could build a big XSLT from scratch in XML Spy and then get a compilation error after importing it into PI. So it would be really good to find the root cause of this.
    <xsl:choose>
    <xsl:when test="RET_DATE">
      <xsl:attribute name="nullFlavor"><xsl:value-of select="RET_DATE"/></xsl:attribute>
    </xsl:when>
    <xsl:otherwise>
        <xsl:choose>
            <xsl:when test="RET_DATE_VALUE='00000000'">
                 <xsl:attribute name="nullFlavor"><xsl:text>NA</xsl:text></xsl:attribute>
            </xsl:when>
            <xsl:otherwise>
                 <xsl:attribute name="value"><xsl:value-of select="RET_DATE_VALUE"/></xsl:attribute>
            </xsl:otherwise>
          </xsl:choose>  
    </xsl:otherwise>
    </xsl:choose>
    Edited by: Florian Guppenberger on Nov 16, 2009 11:05 AM
    Edited by: Florian Guppenberger on Nov 16, 2009 11:06 AM
