Debugger Data Export

Hello all,
I am comparing the DSEG data of a program in two different boxes in the debugger, and cannot find a way to export this data. Goto -> System -> System Areas -> DSEG.
Is there a way to export or print this data? I've searched SAP Notes and this forum thoroughly and cannot find a resolution. I'm using the classic debugger on 4.6C.
Ideas?
Edited by: sappl88 on Jan 3, 2011 7:26 PM

Note 528187 mentions: "It is thus now possible to write the current system area display into a simple text file in the ABAP debugger by entering the function code "SAFI" (save System Area to FIle) - into the current directory of the application server (see profile parameter DIR_HOME)."

Similar Messages

  • Essbase Data Export not Overwriting existing data file

    We have an ODI interface in our environment which is used to export data from Essbase apps to text files using Data Export calc scripts; we then load those text files into a relational database. Lately we are seeing an issue where the Data Export calc script is not overwriting the file and is just appending the new data to the existing file.
    The OverWriteFile option is set to ON.
    SET DATAEXPORTOPTIONS {
        DataExportLevel "Level0";
        DataExportOverWriteFile ON;
        DataExportDimHeader ON;
        DataExportColHeader "Period";
        DataExportDynamicCalc ON;
    };
    The "Scenario" variable is a substitution variable which is set during the runtime. We are trying to extract "Budget" but the calc script is not clearing the "Actual" scenario from the text file which was the scenario that was extracted earlier. Its like after the execution of the calc script, the file contains both "Actual" and "Budget" data. We are not able to find the root cause as in why this might be happening and why OVERWRITEFILE command is not being taken into account by the data export calc script.
    We have also deleted the text data file to make sure there are no temporary files on the server or anything. But when we ran the data export directly from Essbase again, then again the file contained both "Actual" as well as "Budget" data which really strange. We have never encountered an issue like this before.
    Any suggestions regarding this issue?

    Did some more testing and have pretty much zeroed in on the issue. Our Scenario members are actually named like "Q1FCST-Budget", "Q2FCST-Budget", etc.
    This is why we need a member function: the calc script reads "&ODI_SCENARIO" (which is set to Q2FCST-Budget) as a number and gives an error, so we use the @MEMBER function to convert the value to a member name. And this seems to be the root cause of the issue. The ODI_SCENARIO variable is set to "Q2FCST-Budget", but when we run the script with @MEMBER("&ODI_SCENARIO"), the data file brings back values for "Q1FCST-Budget" out of nowhere, in addition to the "Q2FCST-Budget" data we are trying to extract.
    Successful Test Case 1:
    1) Put the scenario "Q2FCST-Budget" hard-coded in the script
    e.g "Q2FCST-Phased"
    2) Ran the Script
    3) Result OK. The script overwrote the file with Q2FCST-Budget data
    Successful Case 2:
    1) Put scenario in @member function
    e.g. @member("Q2FCST-Budget")
    2) Results again ok
    Failed Case:
    1) Deleted the file
    2) Put the scenario in a substitution variable, used the member function @member("&ODI_SCENARIO"), and ran the script. ODI_SCENARIO is set to Q2FCST-Budget in Essbase substitution variables.
    e.g. @member("&ODI_SCENARIO")
    3) Result: the text file contained both "Q1FCST-Budget" and "Q2FCST-Budget" data values.
    We are still no closer to the root cause of why this is happening. Putting the sub var inside the @MEMBER function changes the complete picture and gives us inaccurate results.
    Any clues anyone?

  • Automatise data export from website

    Hello,
    I'm stuck on an issue regarding data export from scopus.com. I have 61 keywords for which I want to export (as CSV) the number of documents by year, country, document type and subject area. Doing that by hand would be extremely tedious, hence my question whether this can be automated. Unfortunately Scopus does not provide an API to access the data directly.
    So, the process to be automated would be:
    1. Open http://www.scopus.com/home.url
    2. Enter the keyword in "search for"
    3. Click on "search"
    4. Click on "Analyze results"
    5. Click on tab "year" and "export". Save the file
    6. Click on tab "country" and "export". Save the file
    6. Click on tab "document type" and "export". Save the file
    6. Click on "subject area" and "export". Save the file
    Are there programs that can help me retrieve the data?
    Thanks!

    You could achieve your goal with bash and wget, using some combination of --save-cookies and --post-data.
    But be aware that Scopus is run by Elsevier, which many view as the most evil of the publishing companies. Think twice before doing something that may annoy them.
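    To make the general shape of such a script concrete, here is a rough sketch in Python using the requests library instead of wget. It is an illustration only: the URLs, form-field names and export parameters are placeholders, and the real ones would have to be captured with the browser's network tools while stepping through the site manually once.
    import requests
    KEYWORDS = ["keyword one", "keyword two"]                   # the 61 search terms
    TABS = ["year", "country", "documenttype", "subjectarea"]   # hypothetical tab ids
    SEARCH_URL = "http://www.scopus.com/search"                 # placeholder endpoint
    EXPORT_URL = "http://www.scopus.com/analyze/export"         # placeholder endpoint
    session = requests.Session()   # keeps cookies across requests, like wget --save-cookies
    for keyword in KEYWORDS:
        # Steps 2-4: submit the search form (the field name is an assumption).
        session.post(SEARCH_URL, data={"searchfor": keyword})
        for tab in TABS:
            # Steps 5-8: request the CSV export for each analysis tab.
            resp = session.get(EXPORT_URL, params={"tab": tab, "format": "csv"})
            with open(f"{keyword}_{tab}.csv", "wb") as f:
                f.write(resp.content)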

  • Unable to stop incorrect date exports

    How do we set up a form in Adobe Acrobat XI so that dates are formatted a certain way (mmm/dd/yyyy), exported the same way to Excel, and always recognized as a "proper" date in Excel?
    Currently the following does not work (Attempt #1):
    Set up a field; Set the format category as date; Set up a Custom format of "mmm/dd/yyyy"
    Create a distribution file
    When users fill out the form, if they type in an incorrect date, e.g., "August 27 2013", the form automagically shows the date on the PDF as "Aug/27/13" - Great!
    When the users submit the form and it's brought into the response file, the dates are shown in a default date format of mm/dd/yyyy - Fine, once the form owners understand this
    When the form owners export the information, the exported data is whatever the original users entered, not what it was automagically formatted to. For instance, if a submitter originally entered "August 27, 2013" then that's what goes across to Excel, and some of these formats Excel doesn't know how to convert. - Understandably frustrating for form owners
    Attempt #2: As a workaround we set up special formatting with a mask of "AAA/99/9999". This at least forces the users to use the same formatting, but it confuses submitters when they need to enter days from 1-9, and we've also found that converting this format to a date in Excel doesn't work - but at least it's consistent! JavaScript was also added to force users to use specific month abbreviations.
    var d = new Date(event.value.replace(/-/g, " "));  // normalize "Aug-27-2013" style input
    if (isNaN(d.getTime())) {  // reject anything that does not parse as a date
        app.alert("Please enter a proper date.\n\nThe month abbreviations are: Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec");
    }
    Attempt #3: The last attempt was to keep the setup from Attempt #1 but also use the JavaScript from Attempt #2. The theory was that if a user entered "August 27 2013" the JavaScript would complain. Alas, the JavaScript appears to run after Acrobat automagically does its date format conversion.
    Does anyone know how to get around this, or have any other ideas to either enforce a usable date format or have Acrobat export the dates as they've been automatically formatted? We've tried to find a way to turn off the automatic date conversion Acrobat is running, but haven't found one yet. Another option seemed to be a mask that allows optional characters (so the "0" wouldn't be needed for days 1-9), but there doesn't seem to be one.
    Thanks in advance!

    Since there was no clear way to ensure the date formatting was correct prior to exporting, we're going to have the respondents use drop-downs to ensure the formatting is correct. Not the most convenient for the users, as they're accustomed to typing values to select them (e.g., for the date 23 they would expect to type 2 then 3) in other applications, but the Acrobat drop-downs don't "group" what's been typed (e.g., 2 then 3 selects 30, not 23), so it will take them a bit to get used to it. I still can't believe that Adobe won't simply export what the value has been formatted to... after all, that's what we set the form up for.
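    A separate, post-processing angle: since the export keeps whatever the submitters typed, the date column can also be normalized outside Acrobat before the data reaches Excel. Below is a minimal Python sketch of that idea; the file name responses.csv and the column name "Date" are assumptions and would need to match the actual export.
    import csv
    from datetime import datetime
    # Input formats we are willing to accept from submitters (extend as needed).
    KNOWN_FORMATS = ["%B %d %Y", "%B %d, %Y", "%b/%d/%Y", "%b/%d/%y", "%m/%d/%Y"]
    def normalize(value):
        # Return the value rewritten as mmm/dd/yyyy, or unchanged if unrecognized.
        for fmt in KNOWN_FORMATS:
            try:
                return datetime.strptime(value.strip(), fmt).strftime("%b/%d/%Y")
            except ValueError:
                continue
        return value  # leave unparseable values as-is for manual review
    with open("responses.csv", newline="") as src, open("responses_clean.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row["Date"] = normalize(row["Date"])
            writer.writerow(row)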

  • Data Export error when migrating cucm cluster on version 7.1.5 to cucm 10.0

    Hi
    Has anyone come across the below? If so, any suggestions for a workaround?
    Oct 01, 2014 11:54 PDT   STATUS   The task has been scheduled.
    Oct 01, 2014 11:54 PDT   INFO     Export task action ID #154 with 1 node(s) scheduled.
    Oct 01, 2014 11:54 PDT   STATUS   The task has started.
    Oct 01, 2014 11:54 PDT   INFO     Export task action ID #154 with 1 node(s) started.
    Oct 01, 2014 11:54 PDT   INFO     Export job for node xx.xx.xx started.
    Oct 01, 2014 12:09 PDT   ERROR    Data export failed for node xx.xx.xx.
    Oct 01, 2014 12:09 PDT   ERROR    Export job for node xx.xx.xx failed.
    Oct 01, 2014 12:09 PDT   ERROR    1 node(s) in Export task action ID #154 failed: xx.xx.xx
    Oct 01, 2014 12:09 PDT   ERROR    Task paused due to task action failures.

    Hi,
    You can log in to PCD through PuTTY to see the logs:
    file list activelog tomcat/logs/ucmap/log4j/ detail date
    Further, run:
    file tail activelog tomcat/logs/ucmap/log4j/ucmap00001.log   (for example)
    Regards,
    Aman

  • What rights are needed to do a Data Export?

    I would like someone from our help desk to be able to do Data-->Export
    in Console1 to export inventory data regularly. When they try this,
    they immediately get the following error:
    Data Export will not proceed. Unable to identify the type of
    installation.
    I saw TID 10088974 but I don't think it applies, because if I log in on the
    same PC, it works (running Console1 locally).
    I also saw another TID indicating the user needs Browse rights to the
    ZEN_invDatabase object. [Public] does have rights to this.
    What rights do I need to grant?
    -Marc Johnson

    On Wed, 15 Jun 2005 22:22:40 GMT, Marc Johnson wrote:
    > What rights do I need to grant?
    they will need the read right to the properties of the database object...
    Marcus Breiden
    Please change -- to - to mail me.
    The content of this mail is my private and personal opinion.
    http://www.edu-magic.net

  • BPC10 - Data manager package for dimension  data export and import

    Dear BPC Expers,
    Need your help.
    I am trying to set up a data manager package for the first time, to export dimension (master) data from one application and import it into another application (both have the same properties).
    I created a test data manager package via Organize > Add Package, chose the process chain /CPMB/EXPORT_MD_TO_FILE, and clicked Add.
    In the Advanced tab of each task there is some script logic already populated. Please find attached the details of the script logic written under each of the tasks (MD_Source, Convert and Target).
    I have not made any changes to the script inside the tasks.
    But when I run the package and select the dimension 'Entity', the second prompt asks for a transformation file, and the system automatically adds the file ...\ROOT\WEBFOLDERS\COLPAL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\Import.xls
    I have not changed anything there.
    At the next prompt it asks for an output file, and it won't allow me to enter the file name.
    Not sure how to proceed further.
    I would be grateful if someone could explain, from experience, how to set up a simple data manager package for exporting master data from a dimension. Should I update the transformation file referenced in the script, and the output file, in the Advanced tab? What transformation file needs to be created, and how is it linked to the data manager package for export/import?
    What are the steps to run the package for exporting master data from a dimension and importing it into another application?
    Thanks in advance for your guidance.
    Thanks and Regards,
    Ramanuj
    =====================================================================================================
    Details of the tasks
    Task: APPL_MD-SOURCE
    PROMPT(DIMENSIONMEMBER,%DIMENSIONMEMBERS%,"Please select dimension","Please select members",%DIMS%)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%TEMPNO2%,%INCREASENO%)
    TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
    TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
    Task: EXPORT_MD_CONVERT
    PROMPT(DIMENSIONMEMBER,%DIMENSIONMEMBERS%,"Please select dimension","Please select members",%DIMS%)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%TEMPNO2%,%INCREASENO%)
    TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
    TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
    Task: FILE_TARGET
    PROMPT(DIMENSIONMEMBER,%DIMENSIONMEMBERS%,"Please select dimension","Please select members",%DIMS%)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%TEMPNO2%,%INCREASENO%)
    TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
    TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
    ================================================================================

    1. Perhaps you want to consider a system copy to a "virtual system" for UAT?
    2. Changes in QAS (as with PROD as well) will give you the delta. They should ideally be clean... You need to check the source system.
    Another option is to generate the profiles in the target system. But for that your config has to be squeaky clean and in sync, including very well maintained and synced SU24 data.
    Cheers,
    Julius

  • Data Export to Oracle table from essbase issue

    Hello,
    I am using a data export calc script to load data from Essbase into an Oracle table. We have Essbase 11.1.2.1 on a Windows 2008 R2 64-bit server, and I have an ODBC system DSN created for this job.
    However, when I launch this process I get a message in the log: "Cannot read SQL driver name for [Backup Exec Catalogs] from [ODBC.INI]"
    I have checked the ODBC.ini file in the C:\Windows directory and it does contain the connection entries.
    Any thoughts as to why I am getting this error on one server while the same process on the SIT server works fine?
    Thanks...

    Please restart the application and try again. Also check the application log for any details.

  • HR Master Data Export error

    Hi all,
    I'm working at a client where we upgraded from 4.6 to 6.0. This client has never used the master data export program.
    I'm trying to configure the master data IDocs, but I get an error when running the program RPCEMDU0_CALL. It refers to an OSS note from back in 2001, so that really does not apply to this installation. The problem seems to be with function module HR_PU12_UPDATE_T532K_T532L on IT0106 - some sort of data dictionary discrepancy. I haven't been successful in finding any newer notes related to this.
    Has anyone experienced this problem before?
    Thanks in advance for your response.
    Cesar

    Hi Bernd,
    Thanks for your response. I looked at that OSS note, but it doesn't seem to be related to the problem I'm having.
    The error reads:
    "Export program terminated   A009
    Export program obsolete, regeneration required
    Do not regenerate immediately: First make a  $STRUC_P0106 back up of the include
    PERNR N 000008 000000
    (See note 375108) This include is very important if old cluster IF records"
    I've re-activated the program and the interface formats, but I'm still getting the same error.
    Thanks again,
    Cesar

  • Problem in data export

    Hi All,
    I have a problem regarding data export. When I use the EXP command from my Oracle Form to export data from the database, the dump file is larger than the one produced when I export data from the same database using DBMS_DATAPUMP. With EXP the dump size is, say, 40,225 KB, while for the same database DBMS_DATAPUMP produces, say, 36,125 KB. Why does this difference occur? Is it a problem? What would be the solution?
    Please help ASAP.
    Thanks in advance.
    Regards
    Sanjit Kr. Mahato
    Edited by: userSanjit on Jul 23, 2009 6:19 PM
    Edited by: userSanjit on Jul 23, 2009 6:24 PM

    Hi,
    exp and Data Pump (expdp/DBMS_DATAPUMP) are different Oracle export utilities, so their output file sizes are not the same; that is why the difference occurs.
    No, this is not a problem.
    Since it is not a problem, there is no solution needed.
    Why do you see this as a problem?
    Cheers
    Anurag

  • Data export BR execution Time

    Hi all,
    The data export BR is taking more than 13 hours to execute, and after it finishes the exported file has only 7,000 rows.
    We have data for only 4 members out of 783 in a particular dimension, but the BR runs for all members. My question: if the data size increases, i.e. if we get data for all 783 members, will the execution time of the same BR still be 13 hours (since we are executing the BR over the same number of combinations), or will it take longer?
    Thanks.

  • XML Multiple Data Export/Import into New Form

    Is it possible to import an XML file containing multiple forms' data (exported via the "Merge data files into spreadsheet" option, then saved as XML) into a "template" form and create individual forms from the merged data? In other words, I've merged 65 forms' data into an XML file. Now I'd like to import it all back into an updated form.
    What I've been doing so far is exporting the XML data individually for each form and importing each form individually into the new form.
    One option is to extend rights so the users can import and export themselves, but I'm still looking into the Formrouter service, which, if implemented, won't be for a while.
    Any solutions to this painful process?
    Thanks - Derek
    I just realized this may be a question for a different forum... Acrobat... My apologies.

    Hi Derek,
    Without the LC Enterprise server product(s) I don't think you will be able to achieve this. Acrobat.com gives a mechanism for distributing the form; I am fairly sure it will allow you to view the responses in a new form.
    Also, applying Reader extensions to the form with Acrobat will not help, as this removes the ability to import/export XML. See https://acrobat.com/#d=3lGJZAZuOmk8h86HCWyJKg. If you are extending rights with LC Reader Extensions ES then this restriction should not apply.
    If you have the 65 XML responses, I would be inclined to bite the bullet and manually import the XML into the new form.
    Good luck,
    Niall
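    For the original pain point of importing 65 responses one at a time, the merged XML can at least be split into individual files with a short script, so each piece can be imported into the updated form on its own. A minimal Python sketch, assuming each response is a repeating child element directly under the root of the merged export (the file name and that structure are assumptions; check a sample export and adjust the names first):
    import xml.etree.ElementTree as ET
    merged = ET.parse("merged_responses.xml")   # name of the merged export (assumed)
    root = merged.getroot()
    # Write each top-level child element out as its own XML document.
    for i, response in enumerate(root, start=1):
        ET.ElementTree(response).write(f"response_{i:03d}.xml", encoding="utf-8", xml_declaration=True)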

  • Data export(ttbulkcp) Oracle TimesTen Question

    I'm trying to export from Oracle TimesTen (TimesTen Release 11.2.1.5.0) with ttbulkcp (Data export), using SQL Developer version 2.1.1.64.
    All other functions operate normally, and ttbulkcp (Data export) on an Oracle table also works fine. But I get the following error from ttbulkcp (Data export) on a TimesTen table:
    java.lang.NullPointerException
         at oracle.dbtools.raptor.format.ResultsFormatterWrapper.getColumnCount(ResultsFormatterWrapper.java:67)
         at oracle.dbtools.raptor.format.ResultsFormatter.getColumnCount(ResultsFormatter.java:130)
         at oracle.dbtools.raptor.timesten.format.TimesTenLoaderFormatter.getColumns(TimesTenLoaderFormatter.java:207)
         at oracle.dbtools.raptor.timesten.format.TimesTenLoaderFormatter.printColumnData(TimesTenLoaderFormatter.java:183)
         at oracle.dbtools.raptor.timesten.format.TimesTenLoaderFormatter.start(TimesTenLoaderFormatter.java:73)
         at oracle.dbtools.raptor.format.ResultSetFormatterWrapper.print(ResultSetFormatterWrapper.java:150)
         at oracle.dbtools.raptor.format.ResultsFormatter.print(ResultsFormatter.java:200)
         at oracle.dbtools.raptor.format.ResultsFormatter.doPrint(ResultsFormatter.java:416)
         at oracle.dbtools.raptor.dialogs.actions.TableExportAction$5.doWork(TableExportAction.java:637)
         at oracle.dbtools.raptor.dialogs.actions.TableExportAction$5.doWork(TableExportAction.java:634)
         at oracle.dbtools.raptor.backgroundTask.RaptorTask.call(RaptorTask.java:193)
         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
         at oracle.dbtools.raptor.backgroundTask.RaptorTaskManager$RaptorFutureTask.run(RaptorTaskManager.java:492)
         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
         at java.lang.Thread.run(Thread.java:619)
    Driver not capable
    Thank you.
    GooGyum

    If you have a DB support contract, I suggest you open an SR on Metalink/MOS to get an official response and follow-up; otherwise you can hope someone from development picks it up here.
    Regards,
    K.

  • Data Exportation

    I am stumped on a data export issue. I thought I would post it on the forum to see if anyone can help me. We are using Planning 11.1.1.3.
    In Essbase we have a calc script written to export data to a txt file. When I ran the calc script recently I noticed a department (dept 101) was not being included in the export file. Here is what has me stumped:
    In the Department dimension, dept 101 sits between dept 100 and dept 102, and the export file pulls in dept 100 and dept 102, so I have to think the CS is written correctly. There is nothing in the CS that singles out dept 101; the CS is written very generally.
    When I look at the departments in Planning I see nothing different in the setup for dept 101 compared to the setup of departments 100 and 102. I realize it's probably hard to diagnose the issue without viewing the dimensions, export file, CS, etc. However, any comments/suggestions would be greatly appreciated. Thx!

    When I look at the departments in Planning I see nothing different in the set up for dept 101 when compared to the set up of departments 100 and 102.
    ---- Did you check the storage properties and consider the possibility of implied sharing? (Just an area to watch out for; this sounds similar to such a case, but without the outline I can't really tell. Just a general point.)
    -VD

  • Endeca data export - Single catalog only

    With ATG 10.1.2, is there a way to configure the Endeca data export / Baseline Index process (as run from the ProductCatalogSimpleIndexingAdmin component in dyn admin) to generate data for products and categories in a particular catalog only, instead of for all catalogs?
    Thanks!
    -Jon
    Edited by: Jon Schneider on Apr 9, 2013 5:43 AM

    cmore, thank you, that's very helpful!
    I was looking into this question as a possible solution to an issue where, when running the baseline import process as part of the ATG-Endeca integration to upload data from ATG to Endeca, catalog data was not being generated properly, and I suspected that part of the problem might be that my application has multiple catalogs.
    The problem turned out to be due to a missing default catalog configuration for my application's site, as described in my posts in this thread: Re: Baseline Index error (invoking from ProductCatalogSimpleIndexingAdmin)
    Since CategoryToDimensionOutputConfig doesn't have a value for the repositoryItemGroup property by default, and it looks like it would be a bit of work to set that up properly, and I no longer have a pressing need to do so, I'm setting this task aside for the time being. cmore, I appreciate your help regardless, since this information may be useful to me or others in the future. (I'll go ahead and mark your post as the solution.)
    Thanks,
    -Jon

Maybe you are looking for

  • Issue while invoking a stored procedure in DB2 from Oracle OSB flow

    oracle.tip.adapter.sa.api.JCABindingException: oracle.tip.adapter.sa.impl.fw.ext.org.collaxa.thirdparty.apache.wsif.WSIFException: servicebus:/WSDL/Test_Project/Application/Project1/TestSP [ TestSP_ptt::TestSP(InputParameters,OutputParameters) ] - WS

  • ITunes 7.3.1 *SLOW* sync on 30GB iPod

    Hi everyone, Since upgrading to iTunes 7.3.1 my sync time for my 30GB iPod has gone from "quick" to "15 minutes or more." My library is rather small compared to others on this forum (only 25GB). I reloaded my iPod (OS and Data) and it is still slow.

  • 4/15/2014 - Beta - Flash Player 13.0.0.199

    The latest Flash Player 13 builds are now available.  You can download Flash Player here: http://www.adobe.com/go/flashplayerbeta. New Features for Flash Player 13: Enhanced Supplementary Character Support for TextField Characters from the Basic Mult

  • Add a pin to bitlocker startup

    hi all, I updated my T500 to use Bitlocker with the TPM. Now, to further secure it against brute force password attacks, I want to add a PIN to the startup.  I see that gpedit.msc option to Require additional authentication at startup but I am not su

  • Has anybody modified the Aperture Web Themes for a Fluid Grid or Responsive Web design

    I have had some experience modifying Apple's Web Themes inside Aperture in Aperture.app/Contents/Resources/WebThemes. I would like to create or modify an existing Web Theme in order to have my site follow the Fluid Grid principles of Responsive Web d