Export repository to filesystem?

Reposted at: How to export repository to file system?
Hello,
For my bachelor thesis I am working on parsing ABAP Objects to calculate various code metrics. The analysis tool and many of the well-known parser APIs require source files on the file system.
Since SAP stores everything in a database, I wonder whether there is a way to export the repository to the file system. If that is not possible, is there another practicable way to access the source code?
Another question: does anyone know a source for a complete grammar (e.g. in EBNF) for ABAP Objects? That would make my life quite a lot easier...
Best regards,
- clemens heppner
Edited by: clemens heppner on May 5, 2008 10:26 AM

Thanks for your advice! I've written this report:
START-OF-SELECTION.
  DATA: source     TYPE REF TO cl_oo_source,
        source_tab TYPE seop_source_string,
        classname  TYPE seoclskey VALUE 'CL_OO_SOURCE'.

  CREATE OBJECT source EXPORTING clskey = classname.

* Copy the source->source table because gui_download( ) needs a
* changeable argument and source->source is read-only.
  source_tab = source->source.

  cl_gui_frontend_services=>gui_download(
    EXPORTING filename = 'c:\source.ao'
    CHANGING  data_tab = source_tab ).
But that approach has some serious drawbacks:
I have to save each class one by one (is there a way to automate that, e.g. to find all classes in one package?).
I cannot access the code of reports and the like, because cl_oo_source requires a class name.
Is there another way, or an extension of this one, that solves these problems?
Thanks,
- clemens
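One possible direction for both points is sketched below: the TADIR catalog table lists all repository objects of a package, CL_OO_SOURCE (as above) covers classes, and the READ REPORT statement covers plain programs. This is an untested sketch; the package name and target directory are placeholders, and error handling (e.g. exceptions from CREATE OBJECT, non-existent sources) is omitted.

```abap
REPORT z_export_package_sources.

* Placeholder package name - replace with your own.
PARAMETERS p_devc TYPE devclass DEFAULT 'ZMY_PACKAGE'.

START-OF-SELECTION.
  DATA: lt_tadir  TYPE STANDARD TABLE OF tadir,
        ls_tadir  TYPE tadir,
        lo_source TYPE REF TO cl_oo_source,
        ls_clskey TYPE seoclskey,
        lt_source TYPE seop_source_string,
        lv_file   TYPE string.

* All classes (CLAS) and programs (PROG) of the package.
  SELECT * FROM tadir INTO TABLE lt_tadir
    WHERE devclass = p_devc
      AND ( object = 'CLAS' OR object = 'PROG' ).

  LOOP AT lt_tadir INTO ls_tadir.
    CLEAR lt_source.
    CASE ls_tadir-object.
      WHEN 'CLAS'.
        ls_clskey-clsname = ls_tadir-obj_name.
        CREATE OBJECT lo_source EXPORTING clskey = ls_clskey.
        lt_source = lo_source->source.
      WHEN 'PROG'.
*       Reads the source of a plain program into an internal table.
        READ REPORT ls_tadir-obj_name INTO lt_source.
    ENDCASE.
*   Placeholder target directory.
    CONCATENATE 'c:\export\' ls_tadir-obj_name '.abap' INTO lv_file.
    cl_gui_frontend_services=>gui_download(
      EXPORTING filename = lv_file
      CHANGING  data_tab = lt_source ).
  ENDLOOP.
```

This would also pick up function groups and includes if you extend the OBJECT filter, but the read mechanism differs per object type, so start with CLAS and PROG.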

Similar Messages

  • How to Export Repository for standalone environments?

    Hi All,
Can someone help me understand whether it is possible to run the export repository command and export certain repositories in a standalone environment?
    Thanks,
    Mathew

It's not possible to run utility commands like exportRepository or startSQLRepository without an ATG installation directory.
If ATG is installed on the machine where the code is deployed, you can use them.
Otherwise, you can use dyn/admin to export repository data.
Steps:
Go to the repository component path, e.g. http://domainname:port/dyn/admin/nucleus/atg/commerce/catalog/ProductCatalog
Execute the <export-items/> command in the text area; it will export all items of that repository along with their dependencies.
If you need to export a specific item descriptor, use <export-item item-descriptor=""/>
    Regards,
    Nitin.

  • MDM 7.1.4 SP4 doesn't Export Repository Schema

    Hi,
    I am using SRM-MDM Catalog 3.0 with MDM 7.1 SP04. In the Console, the Export Repository Schema operation fails.
    MDS log:
    Could not transport expression because it contained a non-transportable element: 'Item Status [Approved]'
    Assignment Code='Set_Status_Code13' of Table Code='MDMSRM_CATALOG_ITEMS' requires additional steps before transport.
    How can I solve this?
    thanks in advance
    Jesse

    Hi Jesse,
    Please refer to SAP Note 1385073.
    In the note, navigate to the General section and note the following:
    1. A code field was added for validations and assignments. All repositories must undergo verify/repair before loading for the first time in MDM 7.1 SP04 so that the added codes for validations and assignments are implemented.
    2. Validations and assignments from previous versions that contain lookup values will need modifying before the repository schema can be transported, as the lookup values in expressions now internally include the lookup table ID in addition to the value itself. This change was made to ensure that expressions with lookup values are transported correctly.
    In order to modify these expressions, load the repository, delete all the referenced lookup values from each expression, and then add them back. If you try to transport the repository schema before modifying the expressions, you will receive a transport error, and a message will be generated in the MDS.log stating the assignments or validations that need to be updated.
    Regards,
    Mandeep Saini

  • Error while doing the repository export

    hi,
    I am getting an error while doing the repository export. Message: GEN-13, Additional Message: Dictionary is not available on this database. Can anyone help me resolve this issue?
    thanks,
    Sundar M

    Hi Sundar,
    Please provide the export repository command you used and the log file for review.
    The error "Dictionary is not available on this database" pops up when the Siebel Repository is not present in the DB.
    For the export process you have to give the repository name as "Siebel Repository", and it will export the schema into a *.dat file.
    NOTE: Do not change the Siebel Repository name for the export process.
    Thanks,
    Shilpi

  • Performance KM repository - Filesystem vs. WebDAV

    Hi,
    I'm a newbie to KM repositories, so I hope this question makes sense.
    I've got two filesystem repositories in my KM. One server is remote and reachable via a 4 Mbit/s line, the other one is located within the intranet.
    When I browse both filesystems in EP, the response time of the intranet server's filesystem is acceptable, whereas the response time of the remote filesystem is very slow.
    So far, not surprising. But when I browse both filesystems via Windows Explorer, they have nearly the same performance (again 4 Mbit/s vs. intranet).
    My questions:
    1) What could be the reason for that? Is it maybe a different protocol?
    2) Would it help if I embedded the remote filesystem via a WebDAV repository?
    3) What exactly are the differences between a WebDAV repository and a filesystem repository, regarding performance and functionality?
    4) Is it possible to use the filesystem via WebDAV and also make changes directly on the filesystem?
    Thanks a lot for your help in advance.
    Joschi

    1) SMB is known not to perform well over higher-latency links (I think). Not sure why this doesn't show in Windows Explorer; maybe heavy caching?
    2) You can try that.
    3) Performance may be better, because the WebDAV protocol exchanges were designed for this purpose.
    4) Yes.
    Best regards, Julian

  • How to create XSDs for each table in repository instead of entire repos

    Hi Gurus,
    Is there any way I could create XSDs for each table in the repository separately, instead of using the "Export repository schema" option, which creates a single XSD/XML file for the entire repository? I need to create an XSD for each table in the repository...
    Any Help greatly appreciated
    Thanks
    Aravind

    Open the lookup table you want the XSD for in Data Manager.
    Export it to Access (you can select all the fields you want to export to Access and then check the option "Open Access after export").
    Now in Access, right-click the table and export it to XML.
    Provided you have the .NET Framework installed on the machine where you are doing this export, you can do the following:
    Use XSD.exe from the command prompt to get the XSD.
    Use the following link as a reference for XSD matters:
    http://msdn.microsoft.com/en-us/library/x6c1kb0s(VS.71).aspx
    (OR)
    Get the whole XML of the repository and distill the structure for the lookups, creating the XSD with any standard XML editor.
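For the XSD.exe step above, a hypothetical invocation might look like this (it assumes the .NET SDK's xsd.exe is on your PATH and that the Access export produced a file named MyLookupTable.xml; the file name is a placeholder):

```shell
rem Infer an XML schema from the exported XML file.
rem Writes MyLookupTable.xsd to the current directory.
xsd.exe MyLookupTable.xml /outputdir:.
```

Run this once per exported lookup table to get one XSD per table.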

  • Error while exporting Quincy demo data in ATG 9.3

    Hi there, I am trying to export ATG 9.3 Quincy demo data from Solid to Oracle, but I get this strange problem while running the command.
    Environment: Ubuntu 11.10 64-bit, JBoss EAP 4.2, ATG 9.3, Sun JDK 1.6 Update 26.
    Went to ATG Home/bin:
    tenzing:$ ./startSolid -f
    tenzing:$ ./startSQLRepository -m DSSJ2EEDemo -exportRepositories all all.xml -repository /atg/userprofiling/ProfileAdapterRepository
    ARGS: args = -m DSSJ2EEDemo -exportRepositories all all.xml -repository /atg/userprofiling/ProfileAdapterRepository
    /harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/javax.servlet.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/jboss-j2ee.jar
    TEMP IS NOW /tmp
    Before base, JAVA_ARGS=
    CLASSPATH=./locallib/:./lib/launcher.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/javax.servlet.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/jboss-j2ee.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/javax.servlet.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/jboss-j2ee.jar
    The following installed ATG components are being used to launch:
    ATGPlatform version 9.3 installed at /harbinger/home/ecommerce/ATG/ATG9.3
    JAVA_ARGS: -Djava.security.policy=lib/java.policy -Datg.dynamo.home=. -Datg.dynamo.root=./.. -Datg.dynamo.display=:0.0 -Djava.protocol.handler.pkgs=atg.net.www.protocol -Djava.naming.factory.url.pkgs=atg.jndi.url -Datg.dynamo.modulepath=./.. -Xms512m -Xmx1024m -XX:MaxPermSize=128m -XX:MaxNewSize=128m -Datg.dynamo.server.home=. -Datg.dynamo.modules=DAS:DAS-UI::DSSJ2EEDemo:DSS -Datg.dynamo.layers= -Dsun.rmi.dgc.server.gcInterval=3600000
    DYNAMO_MODULES: DAS:DAS-UI:DPS:DSS:DSSJ2EEDemo
    CONFIGPATH: /harbinger/home/ecommerce/ATG/ATG9.3/DAS/config/config.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS/config/oca-ldap.jar:vfs[localconfig-1]=/atg/dynamo/service/groupconfig/ClientNodeTypeVirtualFileSystem:vfs[localconfig-1]=/atg/dynamo/service/groupconfig/ClientInstanceVirtualFileSystem:/harbinger/home/ecommerce/ATG/ATG9.3/DAS-UI/config/uiconfig.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DPS/config/targeting.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DPS/config/oca-cms.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DPS/config/oca-html.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DPS/config/oca-xml.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DPS/config/userprofiling.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DPS/config/profile.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DSS/config/config.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DSSJ2EEDemo/config:../DAS/config/dtmconfig.jar:localconfig:../DAF/config/dafconfig.jar
    CLASSPATH: ./locallib/:./lib/launcher.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/javax.servlet.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/jboss-j2ee.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/javax.servlet.jar:/harbinger/home/ecommerce/jboss-eap-4.2/jboss-as/server/all/lib/jboss-j2ee.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DSS/lib/resources.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DSS/lib/classes.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DPS/lib/resources.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DPS/lib/classes.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS-UI/lib/uiresources.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS-UI/lib/uiclasses.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS-UI/lib/jhall.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS/lib/resources.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS/lib/classes.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS/lib/servlet.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS/lib/ice.jar:/harbinger/home/ecommerce/ATG/ATG9.3/DAS/solid/SolidDriver2.1.jar
    PATH: /usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/lib/jvm/java-6-sun-1.6.0.26/bin:/harbinger/home/build/apache-ant-1.8.1/bin:/tmp:/harbinger/home/build/apache-maven-2.2.1/bin:/opt/oracle/oracle/product/10.2.0/db_1//bin:./../DAS/os_specific_files/i486-unknown-linux2
    **** info Mon Nov 07 10:03:23 CST 2011 1320681803373 /atg/demo/QuincyFunds/repositories/Features/Features SQL Repository startup complete
    **** info Mon Nov 07 10:03:23 CST 2011 1320681803969 /atg/userprofiling/ProfileAdapterRepository SQL Repository startup complete
    Nucleus running
    **** Warning Mon Nov 07 10:03:24 CST 2011 1320681804181 DistributorSender No remote servers configured
    **** info Mon Nov 07 10:03:24 CST 2011 1320681804200 /atg/content/media/MediaRepository SQL Repository startup complete
    **** info Mon Nov 07 10:03:24 CST 2011 1320681804251 /atg/userprofiling/PersonalizationRepository SQL Repository startup complete
    **** info Mon Nov 07 10:03:24 CST 2011 1320681804314 /atg/demo/QuincyFunds/repositories/News/News SQL Repository startup complete
    **** info Mon Nov 07 10:03:24 CST 2011 1320681804376 /atg/demo/QuincyFunds/repositories/Images/Images SQL Repository startup complete
    **** info Mon Nov 07 10:03:24 CST 2011 1320681804461 /atg/demo/QuincyFunds/repositories/Funds/Funds SQL Repository startup complete
    **** info Mon Nov 07 10:03:24 CST 2011 1320681804533 /atg/demo/QuincyFunds/repositories/Email/Email SQL Repository startup complete
    **** info Mon Nov 07 10:03:24 CST 2011 1320681804599 /atg/demo/QuincyFunds/repositories/InvestmentTips/InvestmentTips SQL Repository startup complete
    **** info Mon Nov 07 10:03:24 CST 2011 1320681804669 /atg/demo/QuincyFunds/repositories/Offers/Offers SQL Repository startup complete
    exporting repository: /atg/demo/QuincyFunds/repositories/Offers/Offers
    exporting repository: /atg/demo/QuincyFunds/repositories/News/News
    exporting repository: /atg/demo/QuincyFunds/repositories/Email/Email
    exporting repository: /atg/demo/QuincyFunds/repositories/InvestmentTips/InvestmentTips
    exporting repository: /atg/demo/QuincyFunds/repositories/Funds/Funds
    exporting repository: /atg/userprofiling/ProfileAdapterRepository
    exporting repository: /atg/content/media/MediaRepository
    exporting repository: /atg/userprofiling/PersonalizationRepository
    exporting repository: /atg/demo/QuincyFunds/repositories/Features/Features
    exporting repository: /atg/demo/QuincyFunds/repositories/Images/Images
    **** error exporting data:
    CONTAINER:atg.repository.RepositoryException; SOURCE:java.sql.SQLException: [Solid JDBC 02.10.0024] Invalid Date
    **** info Mon Nov 07 10:03:27 CST 2011 1320681807658 /atg/dynamo/service/Scheduler Scheduler shutting down.
    **** info Mon Nov 07 10:03:27 CST 2011 1320681807659 /atg/dynamo/service/Scheduler Scheduler stopped.
    It fails with "Invalid Date". I tried searching for this but did not find an answer.
    I also started the JBoss server:
    ./startDynamoOnJBOSS.sh -c atg -m DafEar.Admin DSSJ2EEDemo -f -run-in-place | /harbinger/home/ecommerce/ATG/ATGLogColorizer_v1_2
    and tried to log in with jack/jack as mentioned in the document, but it says the supplied login was invalid.
    The JBoss error log says:
    10:24:27,605 ERROR [ProfileItemFinder]
    CAUGHT AT:
    CONTAINER:atg.repository.RepositoryException; SOURCE:java.sql.SQLException: [Solid JDBC 02.10.0024] Invalid Date
    at atg.adapter.gsa.GSAItemDescriptor.getPersistentItems(GSAItemDescriptor.java:4426)
    at atg.adapter.gsa.GSAItemDescriptor.getPersistentItems(GSAItemDescriptor.java:4003)
    at atg.adapter.gsa.GSAItemDescriptor.getItems(GSAItemDescriptor.java:3250)
    at atg.adapter.gsa.GSAItemDescriptor.getItems(GSAItemDescriptor.java:2955)
    at atg.adapter.gsa.GSAItemDescriptor.executeQuery(GSAItemDescriptor.java:7383)
    at atg.adapter.gsa.GSAView.executeUncachedQuery(GSAView.java:332)
    at atg.repository.query.QueryCache.executeUncachedQuery(QueryCache.java:693)
    at atg.repository.query.QueryCache.populateEntry(QueryCache.java:905)
    at atg.repository.query.QueryCache.executeCachedQuery(QueryCache.java:433)
    at atg.repository.RepositoryViewImpl.executeQuery(RepositoryViewImpl.java:344)
    at atg.adapter.gsa.GSAView.executeQuery(GSAView.java:281)
    at atg.repository.RepositoryViewImpl.executeQuery(RepositoryViewImpl.java:323)
    at atg.repository.RepositoryViewImpl.executeQuery(RepositoryViewImpl.java:298)
    at atg.userprofiling.RepositoryProfileItemFinder.findByLogin(RepositoryProfileItemFinder.java:159)
    at atg.userprofiling.RepositoryProfileItemFinder.findByLogin(RepositoryProfileItemFinder.java:128)
    at atg.userprofiling.ProfileTools.getItem(ProfileTools.java:808)
    at atg.userprofiling.ProfileTools.locateUserFromLogin(ProfileTools.java:556)
    at atg.userprofiling.ProfileForm.findUser(ProfileForm.java:2170)
    at atg.userprofiling.ProfileForm.handleLogin(ProfileForm.java:1906)
    at atg.scenario.userprofiling.ScenarioProfileFormHandler.handleLogin(ScenarioProfileFormHandler.java:541)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at atg.droplet.EventSender.sendEvent(EventSender.java:582)
    at atg.droplet.FormTag.doSendEvents(FormTag.java:791)
    at atg.droplet.FormTag.sendEvents(FormTag.java:640)
    at atg.droplet.DropletEventServlet.sendEvents(DropletEventServlet.java:523)
    at atg.droplet.DropletEventServlet.service(DropletEventServlet.java:550)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.sessionsaver.SessionSaverServlet.service(SessionSaverServlet.java:2425)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.AccessControlServlet.service(AccessControlServlet.java:602)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.PageEventTriggerPipelineServlet.service(PageEventTriggerPipelineServlet.java:169)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.SessionEventTrigger.service(SessionEventTrigger.java:477)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.ProfileRequestServlet.service(ProfileRequestServlet.java:436)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:469)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.pipeline.URLArgumentPipelineServlet.service(URLArgumentPipelineServlet.java:280)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.pipeline.PathAuthenticationPipelineServlet.service(PathAuthenticationPipelineServlet.java:370)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.sso.PassportServlet.service(PassportServlet.java:561)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.security.ThreadUserBinderServlet.service(ThreadUserBinderServlet.java:91)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:212)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1097)
    at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:779)
    at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:250)
    at atg.filter.dspjsp.PageFilter.doFilter(PageFilter.java:227)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:230)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:173)
    at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:182)
    at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:84)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
    at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:157)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:241)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:844)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: java.sql.SQLException: [Solid JDBC 02.10.0024] Invalid Date
    at solid.jdbc.SolidTA.s_InvDt(Unknown Source)
    at solid.jdbc.SolidTTs.getDate(Unknown Source)
    at solid.jdbc.SolidResultSet.getDate(Unknown Source)
    at org.jboss.resource.adapter.jdbc.WrappedResultSet.getDate(WrappedResultSet.java:565)
    at atg.core.jdbc.ResultSetGetter$DateGetter.getObject(ResultSetGetter.java:215)
    at atg.adapter.gsa.GSAItemDescriptor.getPersistentItems(GSAItemDescriptor.java:4284)
    ... 73 more
    SOURCE EXCEPTION:
    java.sql.SQLException: [Solid JDBC 02.10.0024] Invalid Date
    at solid.jdbc.SolidTA.s_InvDt(Unknown Source)
    at solid.jdbc.SolidTTs.getDate(Unknown Source)
    at solid.jdbc.SolidResultSet.getDate(Unknown Source)
    at org.jboss.resource.adapter.jdbc.WrappedResultSet.getDate(WrappedResultSet.java:565)
    at atg.core.jdbc.ResultSetGetter$DateGetter.getObject(ResultSetGetter.java:215)
    at atg.adapter.gsa.GSAItemDescriptor.getPersistentItems(GSAItemDescriptor.java:4284)
    at atg.adapter.gsa.GSAItemDescriptor.getPersistentItems(GSAItemDescriptor.java:4003)
    at atg.adapter.gsa.GSAItemDescriptor.getItems(GSAItemDescriptor.java:3250)
    at atg.adapter.gsa.GSAItemDescriptor.getItems(GSAItemDescriptor.java:2955)
    at atg.adapter.gsa.GSAItemDescriptor.executeQuery(GSAItemDescriptor.java:7383)
    at atg.adapter.gsa.GSAView.executeUncachedQuery(GSAView.java:332)
    at atg.repository.query.QueryCache.executeUncachedQuery(QueryCache.java:693)
    at atg.repository.query.QueryCache.populateEntry(QueryCache.java:905)
    at atg.repository.query.QueryCache.executeCachedQuery(QueryCache.java:433)
    at atg.repository.RepositoryViewImpl.executeQuery(RepositoryViewImpl.java:344)
    at atg.adapter.gsa.GSAView.executeQuery(GSAView.java:281)
    at atg.repository.RepositoryViewImpl.executeQuery(RepositoryViewImpl.java:323)
    at atg.repository.RepositoryViewImpl.executeQuery(RepositoryViewImpl.java:298)
    at atg.userprofiling.RepositoryProfileItemFinder.findByLogin(RepositoryProfileItemFinder.java:159)
    at atg.userprofiling.RepositoryProfileItemFinder.findByLogin(RepositoryProfileItemFinder.java:128)
    at atg.userprofiling.ProfileTools.getItem(ProfileTools.java:808)
    at atg.userprofiling.ProfileTools.locateUserFromLogin(ProfileTools.java:556)
    at atg.userprofiling.ProfileForm.findUser(ProfileForm.java:2170)
    at atg.userprofiling.ProfileForm.handleLogin(ProfileForm.java:1906)
    at atg.scenario.userprofiling.ScenarioProfileFormHandler.handleLogin(ScenarioProfileFormHandler.java:541)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at atg.droplet.EventSender.sendEvent(EventSender.java:582)
    at atg.droplet.FormTag.doSendEvents(FormTag.java:791)
    at atg.droplet.FormTag.sendEvents(FormTag.java:640)
    at atg.droplet.DropletEventServlet.sendEvents(DropletEventServlet.java:523)
    at atg.droplet.DropletEventServlet.service(DropletEventServlet.java:550)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.sessionsaver.SessionSaverServlet.service(SessionSaverServlet.java:2425)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.AccessControlServlet.service(AccessControlServlet.java:602)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.PageEventTriggerPipelineServlet.service(PageEventTriggerPipelineServlet.java:169)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.SessionEventTrigger.service(SessionEventTrigger.java:477)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.ProfileRequestServlet.service(ProfileRequestServlet.java:436)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:469)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.pipeline.URLArgumentPipelineServlet.service(URLArgumentPipelineServlet.java:280)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.pipeline.PathAuthenticationPipelineServlet.service(PathAuthenticationPipelineServlet.java:370)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.userprofiling.sso.PassportServlet.service(PassportServlet.java:561)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.security.ThreadUserBinderServlet.service(ThreadUserBinderServlet.java:91)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:212)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:135)
    at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1097)
    at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:779)
    at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:250)
    at atg.filter.dspjsp.PageFilter.doFilter(PageFilter.java:227)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:230)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:173)
    at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:182)
    at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:84)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
    at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:157)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:241)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:844)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
    at java.lang.Thread.run(Thread.java:662)
    Any help is much appreciated. Thanks in advance.

    Solved

  • SRM MDM 3.0 BMECAT Repository Schema

    Hello at all,
    I am currently running SRM-MDM 3.0 with the standard repository delivered as an archive on the install CD. I have also created some data with the Data Manager, and it all works very well.
    Now I want to import a catalog based on the BMEcat 1.2 standard. I have the complex catalog XML from the vendor Bechtle and the images folder for the articles.
    When running the Import Manager, there are many fields and tables in the source XML that do not exist in the destination hierarchy (repository).
    My question is: how do I create a repository based on the BMEcat 1.2 standard, so that I can import the vendor catalog XML?
    What I also tried: creating a new repository. It's empty except for the standard tables like Admin, Images, Products etc.
    Then go to Transport -> Import Repository Schema -> choose the vendor XML.
    That doesn't work. I get the error message: "There are problems with the transport (XML) which prevent it from being processed".
    I have learned that an XML schema file has the ending .xsd, but when I try to import, the data browser looks for .xml endings.
    Could someone please advise what I have done wrong?
    Thanks a lot.
    Rico

    Hi Rico,
    I am not aware of the BMEcat standard, but in MDM terms you can take a backup of a repository in two forms:
    1. Archive/Unarchive -> If you want to move the repository from one MDM server to another, i.e. from dev to QA, you can archive the repository, which creates an .a2a file and places it in the Archives folder of the MDM Server installation directory. That file can then be copied to the Archives folder of the target MDM server, where you can unarchive it.
    Archive = Repository Schema + Data
    2. Export/Import Repository Schema -> With this method you can export the schema of the repository and create a new repository on another MDM Server using the Create Repository from Schema command. When you perform the Export Repository Schema command, it asks for a file name and path. You can then copy this schema file and place it in the file system of the other server.
    Export/Import Repository Schema = Repository Schema only
    Regards,
    Jitesh Talreja
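    For scripted transports, the copy step in option 1 is just a file move between the two servers' Archives folders. A minimal sketch with illustrative paths and archive name (real MDM installation directories will differ, and both folders would normally live on different machines):

```shell
#!/bin/sh
# Sketch of the option-1 copy step: move a repository archive (.a2a)
# from the source MDM server's Archives folder to the target's.
# Paths and file name are illustrative stand-ins.
SRC=mdm-source/Archives
DST=mdm-target/Archives

mkdir -p "$SRC" "$DST"
: > "$SRC/MyRepository.a2a"    # stand-in for the archived repository

cp "$SRC/MyRepository.a2a" "$DST/"
ls "$DST"
```

    After the copy, the archive shows up in the target MDM Console for unarchiving.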

  • Import file export instead of application export

    Hi All
    Please help.
    I exported my application, and then I wanted to import that export file.
    I did Import ------->
    Import file: I chose the application export file.
    File type: instead of application, page or component export, I accidentally chose *file export*.
    I clicked Next -> it wrote: Successfully Imported File.
    Then it wrote: If you wish to install now, click Next >.
    I clicked Next: it wrote in the header: Install Static Files
    logged in as: my user
    install file workspace: name of the workspace
    But I did not push the Install Static Files button; I then saw that this is not what I wanted to do. My question is: did I do any harm?
    Are the static files (which I don't really know what they are) sitting on the server, or in the application? Please tell me, is there anything that I need to cancel? I saw in the Export Repository that it is shown as a static files export.
    Note: as a person who develops on this application, I have never used an import of static files.
    In spite of all this, I started to do an import of my application with the suitable file type, and I stopped in the middle.
    I'll be grateful for a quick response!

    Hello Arie !
    Thanks for the quick response.
    I chose one of the two applications which I currently have in my workspace and did what you wrote with Shared Components -> Static Files; it says: no static files. I did the same with my other application and got the same message. Maybe I'm getting this message because I did not do the install operation? However, shared components are related to the application that I chose and not to the whole of APEX, right?
    The thing that I don't get is: if I get "no static files found", why does the Export Repository show a static file export for one application export file?
    Can I delete it from there?

  • Is it possible to import and export Config Tool configuration from one sys

    Hi All,
    Is it possible to import and export Config Tool configuration from one system to another system (QUS/PRD), for a specific service?
    Kindly let me know the pros and cons of it and the step-by-step process.
    Thanks alot for your time.
    Thanks
    AB

    Yes... It is certainly possible, but then you would need to bring the OS-level J2EE file structure as well, and there are lots of changes at OS level in the *.properties files and then at configtool level: changes related to hostname, system numbers, port numbers etc.
    This is not very difficult, but not trivial either; you have to do it very carefully. Please note that SAP DOES NOT SUPPORT THIS METHOD.
    Alternatively, you can use sapinst to export/import the J2EE filesystem from source to target, which in turn would require a configtool export/import and then changes at configtool level.
    Do let us know your requirement so that we could help you in case you are facing any issues.
    cheers !!!
    Ashish

  • [package upgrade] Can't update udisks due to filesystem conflicts

    Hello everybody,
    after not updating my box for like two weeks I wanted to update it this morning.
    Unfortunately the upgrade process hangs due to several file conflicts within the udisks package.
    $ sudo pacman -Syu
    :: Synchronizing package databases...
    core is up to date
    extra is up to date
    community is up to date
    multilib is up to date
    :: Starting full system upgrade...
    resolving dependencies...
    warning: dependency cycle detected:
    warning: lib32-harfbuzz will be installed before its lib32-freetype2 dependency
    looking for inter-conflicts...
    Packages (143):
    ...truncating package names for the sake of readability...
    udisks-1.0.5-1 udisks2-2.1.3-1
    Total Installed Size: 2778.41 MiB
    Net Upgrade Size: 42.67 MiB
    :: Proceed with installation? [Y/n] y
    (143/143) checking keys in keyring [###############################################] 100%
    (143/143) checking package integrity [###############################################] 100%
    (143/143) loading package files [###############################################] 100%
    (143/143) checking for file conflicts [###############################################] 100%
    error: failed to commit transaction (conflicting files)
    udisks: /etc/avahi/services/udisks.service exists in filesystem
    udisks: /etc/dbus-1/system.d/org.freedesktop.UDisks.conf exists in filesystem
    udisks: /usr/bin/udisks exists in filesystem
    udisks: /usr/bin/udisks-tcp-bridge exists in filesystem
    udisks: /usr/bin/umount.udisks exists in filesystem
    udisks: /usr/lib/systemd/system/udisks.service exists in filesystem
    udisks: /usr/lib/udev/rules.d/80-udisks.rules exists in filesystem
    udisks: /usr/lib/udev/udisks-dm-export exists in filesystem
    udisks: /usr/lib/udev/udisks-part-id exists in filesystem
    udisks: /usr/lib/udev/udisks-probe-ata-smart exists in filesystem
    udisks: /usr/lib/udev/udisks-probe-sas-expander exists in filesystem
    udisks: /usr/lib/udisks/udisks-daemon exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-ata-smart-collect exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-ata-smart-selftest exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-change-filesystem-label exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-change-luks-password exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-create-partition exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-create-partition-table exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-delete-partition exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-drive-benchmark exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-drive-detach exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-drive-poll exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-fstab-mounter exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-linux-md-check exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-linux-md-remove-component exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-mdadm-expand exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-mkfs exists in filesystem
    udisks: /usr/lib/udisks/udisks-helper-modify-partition exists in filesystem
    udisks: /usr/share/bash-completion/completions/udisks-bash-completion.sh exists in filesystem
    udisks: /usr/share/dbus-1/interfaces/org.freedesktop.UDisks.Adapter.xml exists in filesystem
    udisks: /usr/share/dbus-1/interfaces/org.freedesktop.UDisks.Device.xml exists in filesystem
    udisks: /usr/share/dbus-1/interfaces/org.freedesktop.UDisks.Expander.xml exists in filesystem
    udisks: /usr/share/dbus-1/interfaces/org.freedesktop.UDisks.Port.xml exists in filesystem
    udisks: /usr/share/dbus-1/interfaces/org.freedesktop.UDisks.xml exists in filesystem
    udisks: /usr/share/dbus-1/system-services/org.freedesktop.UDisks.service exists in filesystem
    udisks: /usr/share/locale/da/LC_MESSAGES/udisks.mo exists in filesystem
    udisks: /usr/share/man/man1/udisks-tcp-bridge.1.gz exists in filesystem
    udisks: /usr/share/man/man1/udisks.1.gz exists in filesystem
    udisks: /usr/share/man/man7/udisks.7.gz exists in filesystem
    udisks: /usr/share/man/man8/udisks-daemon.8.gz exists in filesystem
    udisks: /usr/share/pkgconfig/udisks.pc exists in filesystem
    udisks: /usr/share/polkit-1/actions/org.freedesktop.udisks.policy exists in filesystem
    Errors occurred, no packages were upgraded.
    Anyone got a clue? I could not find anything on the forums or on the homepage.
    Thanks in advance!

    Hi DerAlex, welcome to the forums.
    Please read the following part of the wiki, which explains how you should proceed in this situation: https://wiki.archlinux.org/index.php/Pa … stem.22.21
    Last edited by WorMzy (2014-03-21 08:49:21)
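    The wiki's advice boils down to checking whether any package owns each conflicting file before deleting or overwriting it. A small sketch, assuming the pacman error output has been saved to a text file, that turns those lines into `pacman -Qo` checks to run (the sample file here holds just two of the lines above):

```shell
#!/bin/sh
# Extract the conflicting paths from saved pacman output and print a
# 'pacman -Qo' ownership check for each one. Assumes the error text
# was saved to pacman-errors.txt; adjust the filename as needed.
cat > pacman-errors.txt <<'EOF'
udisks: /etc/avahi/services/udisks.service exists in filesystem
udisks: /usr/bin/udisks exists in filesystem
EOF

# Lines look like "pkg: /path exists in filesystem" -- field 2 is the path.
awk '/exists in filesystem/ {print "pacman -Qo " $2}' pacman-errors.txt
```

    Files that `pacman -Qo` reports as unowned can usually be moved aside before retrying the upgrade; files owned by another package need a closer look first.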

  • Solaris Logical Disks (Filesystems) not being discovered. WinRM CIM_ERR_INVALID_CLASS Error.

    Hello,
    currently I am testing the Solaris monitoring with different kinds of Solaris servers. We have one single system, a RAC database cluster of two servers, and one "Global-Zone" server hosting a "Non-Global-Zone". We're using SCOM 2012 SP1 with Agent Version 1.4.0-906 (sparc).
    My problem is that the Logical Disks (Filesystems) are discovered correctly on only one of the 5 servers. All Solaris systems are using ZFS.
    The one that works just fine is the single system. In Health Explorer under Hardware Availability Rollup I can find every filesystem that is also configured in /etc/vfstab. A check with WinRM from the Management Server works fine (winrm enumerate hxxp://schemas.microsoft.com/wbem/wscim/1/cim-schema/2/SCX_FileSystem?__cimnamespace=root/scx -auth:basic -remote:https://<servername>:1270 -username:scomuser -password:<pass> -skipCAcheck -skipCNcheck -encoding:utf-8)
    The same WinRM command does not work for the other servers; it fails with the following error code:
    Fault
        Code
            Value = SOAP-ENV:Sender
            Subcode
                Value = wsa:DestinationUnreachable
        Reason
            Text = CIM_ERR_INVALID_CLASS
        Detail
            FaultDetail = hxxp://schemas.dmtf.org/wbem/wsman/1/wsman/faultDetail/InvalidResourceURI
    Error number:  -2144108485 0x8033803B
    The WS-Management service cannot process the request. The resource URI is missing or it has an incorrect forma
    t. Check the documentation or use the following command for information on how to construct a resource URI: "w
    inrm help uris".
    As far as I can see in the scx.log, there are some messages like:
    2013-04-25T12:19:33,668Z Warning    [scx.core.common.pal.system.disk.diskdepend:6241:7] No link exists between the logical device "rpool12/export/home" at mount point "/export/home" with filesystem "zfs". Some statistics will be unavailable.
    2013-04-25T14:10:34,775Z Info       [scx.core.providers.diskprovider:6241:11] BaseProvider::EnumInstances() - Invalid class - //<servername>/root/scx:SCX_FileSystem
    2013-04-25T14:13:27,810Z Warning    [scx.core.providers.osprovider:6241:11] BaseProvider::EnumInstances() - Calling opendir() returned an error with errno = 24 (Too many open files) - [/export/home/serviceb/ScxCore_URSP1_SunOS510_sparc/source/code/shared/scxsystemlib/process/processenumeration.cpp:58]
    Do you have any idea what is going wrong and how to fix it?
    Thanks & Regards,
    Holger

    Holger,
    I believe it might be the (Too many files open) issue causing your problems. Try the following and let me know if it helps...
    First confirm if this is the issue. Run the following command to find out the Solaris server's current file descriptor limit:
    ulimit -n
    Next determine the number of file descriptors in use by the SCX Agent processes. A list of file descriptors is located under /proc/<Process ID>/fd, where <Process ID> is the process ID of the agent process running on the Solaris server which
    can be obtained using the ps command. Or use the following command to list the PID’s and number of file descriptors for any processes containing microsoft/scx which should cover all of the SCX Agent processes:
    for i in `ps -ef | grep microsoft/scx | grep -v grep | awk '{print $2}'` ; do echo PID $i: `ls -l /proc/$i/fd | wc -l` ; done
    Note: You must execute the above commands using sh, bash, or other Bourne-compatible shell. Be sure the commands are on a single line. If the result is a > sign, then press CTRL-C, check to make sure quotes and backticks are closed, and try again.
    If the number of file descriptors returned is near the file descriptor limit obtained using ulimit, follow this procedure to increase the limit by putting a ulimit command in the SCX Agent startup script.
    1. Find and edit the scx-cimd startup script, which should be located at /opt/microsoft/scx/bin/tools/scx-cimd in Solaris 10 and 11, or at /etc/init.d/scx-cimd in Solaris 8 and 9:
    vi /opt/microsoft/scx/bin/tools/scx-cimd
    2. In the file, locate the section responsible for starting the agent daemon:
    start)
            # Start daemon
            echo "Starting $DESC"
            $DAEMON $OPTIONS
            exit $?
    3. Below the echo statement, but before the line with $DAEMON $OPTIONS, insert this command on its own line:
    ulimit -n 1024
    The end result should look like:
    start)
            # Start daemon
            echo "Starting $DESC"
            ulimit -n 1024
            $DAEMON $OPTIONS
            exit $?
    4. Save the file and restart the agent:
    scxadmin -restart all
    Regards,
    -Steve
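    If several servers or zones need the same fix, the edit in step 3 can be scripted instead of done by hand in vi. A minimal sketch that applies the change with awk, run here against a mock copy of the start block rather than the real /opt/microsoft/scx/bin/tools/scx-cimd (point SCRIPT at the real file only after checking the result, and only if the script matches the layout shown above):

```shell
#!/bin/sh
# Insert 'ulimit -n 1024' just before the '$DAEMON $OPTIONS' line of an
# scx-cimd-style init script. Operates on a mock copy for demonstration.
SCRIPT=mock-scx-cimd

cat > "$SCRIPT" <<'EOF'
start)
        # Start daemon
        echo "Starting $DESC"
        $DAEMON $OPTIONS
        exit $?
EOF

# Print the ulimit line immediately before the daemon start line.
awk '/\$DAEMON \$OPTIONS/ {print "        ulimit -n 1024"} {print}' \
    "$SCRIPT" > "$SCRIPT.new" && mv "$SCRIPT.new" "$SCRIPT"

grep -n 'ulimit -n 1024' "$SCRIPT"
```

    The same loop you used to count file descriptors could then drive this edit across all agent hosts.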

  • Create new repository from schema

    The option "Create new repository from schema" is not appearing in my MDM console.
    How can I add this?
    MDM 5.5

    Hi,
    I am on MDM 5.5 SP6 Patch 1 and the option is visible here.
    Maybe on a smaller version this option is not there.
    Well, on right-clicking at your repository level, do you get the options Export Repository Schema and Import Repository Schema?
    If yes, you can achieve the same result: first create a new repository and then use the option Import Repository Schema; you will get the same result as "Create new repository from schema".
    Hope it will help you,
    Rewards if useful...
    Mandeep Saini
    Edited by: Mandeep Saini on Jun 17, 2008 5:24 PM

  • (fileSystem::extract Blob From Local Path) after installing B.O. XI 3.1

    Hello,
    I've installed B.O. XI 3.1 client tool on a laptop, but when I try to run Designer, or Deski the following error occurs (REPOSITORY ERROR (fileSystem::extract Blob From Local Path) file not found) and the application quits.
    Anyone has an idea?

    Hi
    What operating system are you using on your laptop?
    I have a colleague having a similar problem using Windows Vista, whereas I am fine (using XP).
    Did you manage to resolve this issue? (And if so, how)
    regards, Lara

Maybe you are looking for

  • Availability  check in Enjoy Purchase Order & Requisition

    Hi Experts, The availability check is used in 4.7E also but what is difference in Ecc 6.0 or new functionality added Can any one plz let me know Regards Pratap

  • FI-MM integration cost flow

    Dear Experts; FI-MM integration, how the costs flows? that is frm were inventory cost is updating, frm were GR/IR cost is updating?- Material maseter or PO cost or invoic cost Also , how we dealt with price differene, PRD? ( i have knowledge about do

  • Main scenarios in SAP XI in a business environment

    Hi, I am a starter in XI and would like to know ..what are the different scenarios that we come across in a real time environment while working with sap xi. Help will be greatly appreciated... Thanks in advance, vas

  • Output html is not W3C valid

    Hi, Would hear if you know why simple tags aren't compatible with doctype declaration? Or am I missing something here...? <?xml version='1.0' encoding='UTF-8'?> <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.0" xmlns:f="http://java.sun

  • Expense Report Export Program keeps on running normally for hours

    Hi and assalam o alay kum Gurus I am facing a problem in creating of invoice through expense report export program, I create expense report and run the program "Expense Report Export" but the issue is the program keeps on running for several hours we