Enabling data management for a ColdFusion data service

Hello,
    I have a problem calling a service within a data service. If I enable data management, I can't insert or update records; if I disable data management, I can insert records but get no results from "get all" or any other get service. It was working before. I completely removed the data service and created it again, and even deleted the value object and all the ActionScript files related to it, but the problem remains.
However,
if I call the create-record function and then try to get that record, I do get the result from the database?!
Can someone help, please?

I saw the tutorial but that doesn't actually show how to configure for POST - it just uses GET and DELETE. When you configure for POST, you get prompted to "Specify an XSD that describes the structure of the request content". That implies that it is going to send the service an XML file as the attachment and wants to know the format of the XML file. The service that I want to use doesn't expect to receive an XML file, it expects form fields, and I think HTML forms send these like this:
p_zip_code=22209
p_selection=a
p_selection=b
p_selection=c
Of course, the correct answer to my question may be "You can't get there from here", at least until 12c.
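To make the wire format concrete, here is a minimal Python sketch of that kind of form-field POST (the service URL is hypothetical, and it assumes the third-party requests package is available). The body goes out as application/x-www-form-urlencoded, and a repeated field name such as p_selection simply appears more than once:

import requests  # assumption: the third-party 'requests' package is installed

# Form fields as a list of (name, value) pairs so p_selection can repeat,
# exactly as an HTML form with several checked options would send them.
form_fields = [
    ("p_zip_code", "22209"),
    ("p_selection", "a"),
    ("p_selection", "b"),
    ("p_selection", "c"),
]

# requests encodes this as application/x-www-form-urlencoded:
# p_zip_code=22209&p_selection=a&p_selection=b&p_selection=c
response = requests.post("https://example.com/my_service", data=form_fields)  # hypothetical URL
print(response.status_code)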
Ah ha! I see that some of what Frank mentioned is in the newly released 11.1.2.2, and didn't wait for 12c. I'll give that a try.

Similar Messages

  • How can I do data management for classification?

    Hi,
    How can I do data management for classification?
    If I have a new class and want to add it to a lot of materials, how can I do that? Is there a transaction to add the class and also manage the classification data in mass?
    Vijay

    CLMM is the transaction for mass maintenance.
    There, click the "Set and change values" button and enter your class type.
    In the Assignments tab, enter the new class.
    In the Target Objects tab, enter the material numbers that need the new class assigned. (You can search for the materials by the class they currently have and adopt the hits.)

  • LabVIEW: How to specify a 1D array of clusters as the data type for Variant To Data

    Hi, I'm new to LabVIEW. Can anyone tell me how to specify a 1D array of clusters as the data type for Variant To Data?

    First of all, you should be sure that the variant actually contains such a data type; otherwise, you will run into errors.
    I recommend creating the cluster and making a type definition from it. Then drop an array shell from the array palette and drop the cluster type definition into that array.
    Connect that constant to the data type input of the Variant To Data function.
    Norbert
    CEO: What exactly is stopping us from doing this?
    Expert: Geometry
    Marketing Manager: Just ignore it.

  • Enable document management for entities through a PowerShell script (Dynamics CRM 2013 on premises)

    Hello,
    Can anybody let me know if it is possible to enable document management for entities through a PowerShell script for Dynamics CRM 2013 on premises?
    I want a PowerShell script where the user supplies the entity (Accounts, Contacts, etc.) for the CRM.
    The script should enable document management for that entity.
    Thank you for your support.

    Hi Jeff,
    Any updates? If you have any other questions, please feel free to let me know.
    A little clarification to the script:
    function _ErrObject{
        Param(
            $name,
            $errStatus
        )
        If(!$err){
            Write-Host "error detected"
            $script:err = $True
        }
        # Build a record describing the error and append it to the script-level output
        $ErrObject = New-Object -TypeName PSObject
        $ErrObject | Add-Member -Name 'Name' -MemberType NoteProperty -Value $name
        $ErrObject | Add-Member -Name 'Comment' -MemberType NoteProperty -Value $errStatus
        $script:ErrOutput += $ErrObject
    }
    $ErrOutput = @()
    # Incorrect call: the comma turns "Name" and the message into a single array argument
    _ErrObject Name, "Missing External Email Address"
    $ErrOutput
    # Correct call: name and status are passed as two separate positional arguments
    _ErrObject Name "Missing External Email Address"
    $ErrOutput
    Best Regards,
    Anna Wang
    TechNet Community Support

  • Can anyone confirm the date used for pushing data into the AR interface table? Is it based on the actual ship date or the scheduled ship date?

    Can anyone confirm the date used for pushing data into the AR interface table? Is it based on the actual ship date or the scheduled ship date? We are facing a scenario where the transaction date is earlier than the actual ship date, which logically sounds incorrect.
    Appreciate any quick response around this.

    Hi,
    The transaction date will be the AutoInvoice master program's submission-level date (if you haven't set up any other logic).
    Please check the default date at the program level; if the user enters an old date, the system will pick that up.
    The customer is trying to set the profile "OM: Set Receivables Transaction Date as Current Date for Non-Shippable Lines" at the responsibility level, but the system does not set the transaction date to the current date in ra_interface_lines_all.
    CAUSE
    The customer used this functionality in R11i, but after the upgrade to R12 the system behaves differently.
    SOLUTION
    1. Ensure that no scheduled Workflow Background Process runs are pending.
    2. Set the profile "OM: Set Receivables Transaction Date as Current Date for Non-Shippable Lines" to Yes at the responsibility level only.
    3. Switch to the responsibility for which the profile is set.
    4. Create an order for non-shippable lines and progress it to invoicing.
    5. Ensure that the Workflow Background Process concurrent program is run in the same responsibility and that this line is picked up by it.
    6. Check whether SHIP_DATE_ACTUAL is populated in ra_interface_lines_all.

  • LiveCycle 2.6.1 Data Management with The ColdFusion 8.0 DataManagement Event Gateway Issue

    Hello all,
         I've recently been developing a project that involves sending out events from ColdFusion to LiveCycle 2.6.1 using the Data Management event gateway to Flex 4.0 clients (LiveCycle and ColdFusion are on different Instances, but the same server).  To begin with, I used ColdFusion assemblers, DAO's, and models and everything worked fine locally.  After deploying this setup to a beta site, I decided that this setup would be very troublesome in terms of configuring clustered instances across multiple servers.  I then decided to convert my assemblers, DAO's, and models to Java.  The conversion went well and the flex clients see the exact same data as they did with the ColdFusion adapter.
         Once I tried to send an update through from my ColdFusion application to a Flex client, I get an error stating that:
    "Unable to find the Flex adapter for destination My_Dest in the RMI registry on localhost:1099.The Flex adapter may not be running or the destination may be incorrect."
    After seeing this error, I downloaded a Java-based RMI inspector to see what was going on.  To get a good idea of what was happening when the ColdFusion adapter was being used, I switched my data-management-config file back to the CF adapter.  I noticed that the RMI entries were as follows:
    localhost:1099/cfdataserviceadapter/My_Dest
    localhost:1099/cfassembler/my_cf_instance
    Once I gathered this data as the base, I converted back to the Java adapter in my data-management-config file, restarted the servers, and ran the RMI inspector again.  Only the "localhost:1099/cfassembler/my_cf_instance" was showing.  (This one shows because I have "Enable Remote Adobe LiveCycle Data Management Access" checked in my CF instance's CF Admin -> Flex Integration).  Since I don't need this checked anymore, I unchecked it and re-ran the RMI inspector.  As it should, the "localhost:1099/cfdataserviceadapter/My_Dest" went away.  Since no destination shows up, it means that the Flex adapter isn't registering my "my_Dest" destination with RMI.  Since it isn't registered, I can't see it when I try to send a message through the CF Data Management event gateway.
    Can anyone help me out here?  I certainly may be missing something when it comes to RMI (I don't work with Java very often).  Any advice would be greatly appreciated!
    Thank you,
    Dustin Blomquist

    Dustin,
    Without the ColdFusion based data management destination defined on the LCDS server, the destination will not show up in the RMI registry.  It is only the CF adapter code that does this.  The 'stock' LCDS adapter does not support invoking via RMI the way the CF version does.
    I would recommend you run the LCDS MessageBrokerServlet inside the ColdFusion web application.  This will give you two things:
    1. You will not have the overhead of RMI between CF and LCDS as they will share the same VM (better performance!).
    2. You will be able to use the CF Data Management Gateway to pass messages to Java-based destinations.  The APIs the gateway uses should work fine with either CF or Java based Data Management destinations.
    The CF/LCDS integration doesn't support what you are trying to do when you run two separate instances.

  • System and Query fields disabled in Data Manager for NetWeaver BI

    Dear All,
    I have installed Xcelsius Engage Server 2008. When I try to add a connection for SAP NetWeaver BI from the Data Manager, the dialog appears correctly, but on the Definition tab only the Name field is enabled; both the System and Query fields are disabled.
    Without those, how can I configure the connection? Please let me know how to fix this issue.
    Kind regards,

    Hi,
    You give the name of the connection and then click on "Browse".  It will then prompt you to connect to the desired system.  Log in and then select the appropriate query you want to build a dashboard on.
    Hope this helps.
    Regards,
    RashmiG

  • Automate Data Manager for a particular InfoCube in BW

    Experts,
    How can I run the process chain /CPMB/LOAD_INFOPROVIDER in RSPC and hard-code it to a certain InfoCube in BW?
    OR
    Is there a process type which loads data from a BW cube to a BPC cube using Data Manager and does NOT switch the cubes?
    I see that we can use the program UJD_TEST_PACKAGE to automate this, but my DM package is not running.
    I gave the answer prompts below; can you please let me know if the syntax and method are correct?
    Also, does the user ID need to be the SYSTEM user ID?
    %InforProvide%     ZCUBENAME
    %TRANSFORMATION%     \ROOT\WEBFOLDERS\APPSET\APPLICATION\DATAMANAGER\TRANSFORMATIONFILES\Load BW data.xls
    %CLEARDATA%     0
    %RUNLOGIC%     0
    %CHECKLCK%     0
    EHP 1 for SAP NetWeaver 7.0
    CPMBPC 7.5 LEVEL 8
    Thanks

    Hi,
    Create a package link. While defining the package link, you can make the cube name a set prompt.
    Sorry... my understanding is wrong!
    Hope it helps...
    regards,
    Raju

  • Sort all the records in Data Manager based on Update Date

    How do I sort the records in Data Manager based on date (or a similar field)?
    Also, I have three records for the same customer (for three different company codes). When I search for this customer by customer number, all three records are shown, but when I search by Update Date no records are fetched. What is the reason?

    If you make a field as type "Time Stamp" in your data model using the MDM Console, make sure that the option Sort Index is set to Normal.  If this is the case, whenever a record is updated in the repository, the date will be updated.  If the sort index is normal, then MDM will allow you to sort the records based on the date and time by which they were updated.  You can do this by finding your time stamp field in the MDM data manager and clicking on it.  If there is an up and down arrow next to the name of your field, it means it can be sorted.  Also, to answer your other question, you can definitely search based on date.  Use the Free-Form Search on the left hand side of the data manager at the bottom of the screen.  Simply select a date, and it will show you all the records updated on that date.

  • Clear Data Manager Package Error "The data file is empty."

    Hi,
    When I run the Clear data package in Data Manager, I receive the error "The data file is empty." I selected a very specific set of dimension values (none are calculated) and am on BPC 7.5 SP3. I subsequently turned on debugging to troubleshoot, but do not see any obvious issues leading to the error message. The log file with debugging turned on is below. Any help would be greatly appreciated!
    Thanks.
    Tom
    TOTAL STEPS  3
    1. Export_Zero:        completed  in 1 sec.
    2. Load Cube:          Failed  in 0 sec.
    3. Clear:              completed  in 0 sec.
    [Selection]
    ENABLETASK= Yes
    CHECKLCK= Yes
    (Member Selection)
    Category: ACTUAL
    Time: 2010.C_SEP
    Affiliate: az_swhd
    Account: Donor_DART_ID_1
    Functional: Benchmark_F
    Report: Cons
    Restriction: AnyRestricted
    [Messages]
    The data file is empty. Please check the data file and try again.
    [EvModifyScript Detail]
    12-28-2010  17:30:05 - Debug turned ON
    INFO(%TEMPFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(EXPORT_ZERO, APPSET, ESMetrics)
    TASK(EXPORT_ZERO, APP, CONSOLIDATED)
    TASK(EXPORT_ZERO, USER, NESSGROUP\tbardwil)
    TASK(EXPORT_ZERO, FILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(EXPORT_ZERO, SQL,
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    TASK(EXPORT_ZERO, DATATRANSFERMODE, 2)
    TASK(LOAD CUBE, APPSET, ESMetrics)
    TASK(LOAD CUBE, APP, CONSOLIDATED)
    TASK(LOAD CUBE, USER, NESSGROUP\tbardwil)
    TASK(LOAD CUBE, FILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(LOAD CUBE, DATATRANSFERMODE, 4)
    TASK(LOAD CUBE, DMMCOPY, 0)
    TASK(LOAD CUBE, PKGTYPE, 0)
    TASK(LOAD CUBE, CHECKLCK, 1)
    TASK(CLEAR COMMENTS, APPSET, ESMetrics)
    TASK(CLEAR COMMENTS, APP, CONSOLIDATED)
    TASK(CLEAR COMMENTS, USER, NESSGROUP\tbardwil)
    TASK(CLEAR COMMENTS, DATATRANSFERMODE, 0)
    TASK(CLEAR COMMENTS, SELECTIONORFILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(CLEAR COMMENTS, ENABLETASK, 1)
    TASK(CLEAR COMMENTS, CHECKLCK, 1)
    INFO(%ENABLETASK%, 1)
    INFO(%CHECKLCK%, 1)
    INFO(%SELECTION%, [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'))
    INFO(%TOSELECTION%, [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'))
    INFO(%APPSET%, ESMetrics)
    INFO(%APP%, CONSOLIDATED)
    INFO(%CONVERSION_INSTRUCTIONS%, )
    INFO(%FACTCONVERSION_INSTRUCTIONS%, )
    INFO(%SELECTIONFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\FROM_51_.TMP)
    INFO(%TOSELECTIONFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\TO_51_.TMP)
    INFO(%DEFAULT_MEASURE%, PERIODIC)
    INFO(%MEASURES%, Periodic,QTD,YTD)
    INFO(%OLAPSERVER%, ETSCSAP047940.EASTER-SEALS.ORG)
    INFO(%SQLSERVER%, ETSCSAP047940.EASTER-SEALS.ORG)
    INFO(%APPPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\)
    INFO(%DATAPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\DataFiles\)
    INFO(%DATAROOTPATH%, C:\BPC\Data\WebFolders\)
    INFO(%SELECTIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\SelectionFiles\)
    INFO(%CONVERSIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\ConversionFiles\)
    INFO(%TEMPPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\)
    INFO(%LOGICPATH%, C:\BPC\Data\WebFolders\ESMetrics\Adminapp\CONSOLIDATED\)
    INFO(%TRANSFORMATIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\TransformationFiles\)
    INFO(%DIMS%, [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[Time])
    INFO(%FACTDIMS%, [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID])
    INFO(%CATEGORY_DIM%, [Category])
    INFO(%TIME_DIM%, [Time])
    INFO(%ENTITY_DIM%, [Affiliate])
    INFO(%ACCOUNT_DIM%, [Account])
    INFO(%CURRENCY_DIM%, )
    INFO(%APP_LIST%, Consolidated,ES_INC,GrantMgmt,LegalApp,LRate,Ownership,Rate)
    INFO(%ACCOUNT_SET%, DONOR_DART_ID_1)
    INFO(%AFFILIATE_SET%, AZ_SWHD)
    INFO(%CATEGORY_SET%, ACTUAL)
    INFO(%FUNCTIONAL_SET%, BENCHMARK_F)
    INFO(%REPORT_SET%, CONS)
    INFO(%RESTRICTION_SET%, ANYRESTRICTED)
    INFO(%TIME_SET%, 2010.C_SEP)
    INFO(%ACCOUNT_TO_SET%, DONOR_DART_ID_1)
    INFO(%AFFILIATE_TO_SET%, AZ_SWHD)
    INFO(%CATEGORY_TO_SET%, ACTUAL)
    INFO(%FUNCTIONAL_TO_SET%, BENCHMARK_F)
    INFO(%REPORT_TO_SET%, CONS)
    INFO(%RESTRICTION_TO_SET%, ANYRESTRICTED)
    INFO(%TIME_TO_SET%, 2010.C_SEP)
    INFO(DATAMGRGLOBALBPU, )
    INFO(DATAMGRGLOBALCLIENTMACHINEID, ETSCWLT048794)
    INFO(DATAMGRGLOBALERROR, )
    INFO(DATAMGRGLOBALPACKAGEINFOR, )
    INFO(DATAMGRGLOBALPACKAGENAME, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\PackageFiles\System Files/Clear.dtsx)
    INFO(DATAMGRGLOBALSEQ, 51)
    INFO(DATAMGRGLOBALSITEID, )
    INFO(MODIFYSCRIPT, DEBUG(ON)<BR>PROMPT(SELECTINPUT,[CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'),,"SELECT THE MEMBERS TO CLEAR",[Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[Time])<BR>PROMPT(RADIOBUTTON,1,"DO YOU WANT TO CLEAR COMMENTS ASSOCIATED WITH DATA REGIONS IN BPC?",1,{"YES","NO"},{"1","0"})<BR>PROMPT(RADIOBUTTON,1,"SELECT WHETHER TO CHECK WORK STATUS SETTINGS WHEN DELETING COMMENTS.",1,{"YES, DELETE COMMENTS WITH WORK STATUS SETTINGS","NO, DO NO DELETE COMMENTS WITH WORK STATUS SETTINGS"},{"1","0"})<BR>INFO(C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempfdla_51_.tmp)<BR>TASK(EXPORT_ZERO,APPSET,ESMetrics)<BR>TASK(EXPORT_ZERO,APP,CONSOLIDATED)<BR>TASK(EXPORT_ZERO,USER,NESSGROUP\tbardwil)<BR>TASK(EXPORT_ZERO,FILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(EXPORT_ZERO,SQL,
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    )<BR>TASK(EXPORT_ZERO,DATATRANSFERMODE,2)<BR>TASK(LOAD CUBE,APPSET,ESMetrics)<BR>TASK(LOAD CUBE,APP,CONSOLIDATED)<BR>TASK(LOAD CUBE,USER,NESSGROUP\tbardwil)<BR>TASK(LOAD CUBE,FILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(LOAD CUBE,DATATRANSFERMODE,4)<BR>TASK(LOAD CUBE,DMMCOPY,0)<BR>TASK(LOAD CUBE,PKGTYPE,0)<BR>TASK(LOAD CUBE,CHECKLCK,1)<BR>TASK(CLEAR COMMENTS,APPSET,ESMetrics)<BR>TASK(CLEAR COMMENTS,APP,CONSOLIDATED)<BR>TASK(CLEAR COMMENTS,USER,NESSGROUP\tbardwil)<BR>TASK(CLEAR COMMENTS,DATATRANSFERMODE,0)<BR>TASK(CLEAR COMMENTS,SELECTIONORFILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(CLEAR COMMENTS,ENABLETASK,1)<BR>TASK(CLEAR COMMENTS,CHECKLCK,1)<BR>BEGININFO(
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    )<BR><BR><BR><BR><BR><BR><BR>SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) AS ZEROTABLE  GROUP BY [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)<BR><BR><BR><BR>ENDINFO<BR><BR><BR>)

    You can greatly improve your chance of receiving a helpful answer to your question if you state the version (MS or NW) and the release (5.1, 7.0, 7.5) of BPC which you are using.
    Also notice the sticky note "Please do not post BPC, SSM or FI/CO questions here!" at the top of this forum, where we announced new dedicated forums for BPC; those are the proper place to post your BPC questions in the future so they reach the right audience.
    Thanks and best regards,
    [Jeffrey Holdeman|http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman]
    SAP Labs, LLC
    BusinessObjects Division
    Americas Applications Regional Implementation Group (RIG)

  • How to handle Multiple date formats for the same date field in SQL*Loader

    Dear All,
    I got a requirement where I need to get data from a text file and insert the same into oracle table.
    I am using SQL*Loader to populate the data from the text file into my table.
    The file has one field where I am expecting date data in multiple formats, such as dd/mon/yyyy, yyyy/dd/mon, yyyy/mon/dd, mm/dd/yyyy, and mon/dd/yyyy.
    While using SQL*Loader, I can see that loading fails for records with formats like yyyy/dd/mon, yyyy/mon/dd, and mon/dd/yyyy.
    Is there any way in SQL*Loader to specify all these date formats so that the data loads smoothly into the underlying DATE column of the table?
    Appreciate your response on this.
    Thanks,
    Madhu K.

    The point being made was: are you sure that you can uniquely identify a date format from the value you receive? Are you sure that the data stored is only in a particular, limited set of formats?
    e.g. if you had a value of '07/08/03' how do you know what format that is?
    It could be...
    7th August 2003 (thus assuming it's DD/MM/RR format)
    or
    8th July 2003 (thus assuming it's MM/DD/RR format)
    or
    3rd August 2007 (thus assuming it's RR/MM/DD format)
    or
    8th March 2007 (thus assuming it's RR/DD/MM format)
    or even more obscurely...
    3rd July 2008 (MM/RR/DD)
    or
    7th March 2008 (DD/RR/MM)
    Do you have any information to tell you what formats are valid that would allow you to be specific and know what date format is meant?
    This is a classic example of why dates should be stored on the database using DATE datatype and not VARCHAR2. It can lead to corruption of data, especially if the date can be entered in any format a user wishes.
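    To make the ambiguity concrete, here is a minimal Python sketch (illustrative only, not a SQL*Loader feature): it parses one value against several candidate formats and shows that several different calendar dates all match, which is exactly why the format cannot be inferred from the value alone.
    from datetime import datetime

    # Candidate formats the feed might be using (an assumption for illustration).
    CANDIDATE_FORMATS = ["%d/%m/%y", "%m/%d/%y", "%y/%m/%d", "%y/%d/%m"]

    def possible_dates(value):
        """Return every distinct date the value could represent under the allowed formats."""
        results = set()
        for fmt in CANDIDATE_FORMATS:
            try:
                results.add(datetime.strptime(value, fmt).date())
            except ValueError:
                pass  # value does not match this particular format
        return sorted(results)

    # '07/08/03' matches four formats and yields four different calendar dates,
    # so the loader cannot guess the format; the feed must declare which one it uses.
    print(possible_dates("07/08/03"))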

  • Data Sources for the Master data of FI-CO,MM and SD from R3

    Hi Gurus,
    Could you please tell me the main DataSources for FI-CO, MM, and SD master data that need to be loaded from R/3 in a new BI implementation?
    Could you please guide me step by step?
    Thanks in advance,
    Pratham

    Hi,
    It varies by client and project, but the most common SD & MM DataSources come from the LO Cockpit.
    help.sap.com is the best source for this.
    Hope it helps

  • Identity Management for UNIX (aka Windows Services for Unix) Adding 2012 DC to a prep'd 2003 domain.

    We have been successfully using Windows Services for Unix on a 2003 domain for passwd and group maps.
    I prep'd the domain to allow a 2012 R2 server to be added and then added the IdMU role/feature on this new 2012R2 DC. Now the passwd map is still OK but the group map now shows full usernames rather than short names.
    i.e. what DID show with "ypcat group" as ...
    "infra-shared::65550:gfer,jhug,shig", now shows as
    "infra-shared::65550:Garry Ferguson,Jason Hughes,Steve Higgins"
    and so is not usable. I have had to revert to local /etc/group files on all our unix machines!!
    Help/comments would be really appreciated!
    Garry Ferguson

    Hi Gaz Ferg,
    SFU 3.5 was installed on Windows 2003 and Windows XP. SFU 3.5 cannot be used on Windows 2012, so customers cannot use its NFS and User Name Mapping services on Windows 2012. From Windows 2003 R2 onward, NFS is a built-in component of the OS; we need to add the corresponding Roles/Features to use NFS.
    1. What has changed in 2012 R2
    The IDMU component, which was used to authenticate Linux users, has been removed. A Windows server can no longer play the role of NIS master server.
    Passwords cannot be synchronized to the UNIX machines, and maps cannot be synchronized between Windows and UNIX computers.
    2. What has not changed in 2012 R2
    The following methods to authenticate and map a UNIX user to a Windows user are still available:
    Active Directory
    Active Directory Lightweight Directory Services (AD LDS)
    Username Mapping Protocol store (MS-UNMP)
    Local passwd and group files
    Unmapped UNIX Username Access (UUUA) (applies to Server for NFS using AUTH_SYS only)
    You can find more information about this here –
    http://blogs.technet.com/b/filecab/archive/2012/10/09/nfs-identity-mapping-in-windows-server-2012.aspx
    http://blogs.msdn.com/b/shan/archive/2006/12/13/sfu-sua-idmu-fun-with-names.aspx
    More information:
    Install Identity Management for UNIX Components
    http://technet.microsoft.com/en-us/library/cc731178.aspx
    I’m glad to be of help to you!

  • Not getting proper date values for original GI Date

    Hi,
    I have to calculate the Original GI Date based on the Original Promise Date and the Transport Date (Original GI Date = Original Promise Date - Transport Date). I am getting the Original Promise Date as "21112008" and the Transport Date as "20112008". When I subtract them in the update-rule formula (Original Promise Date - Transport Date), I get the Original GI Date as "01000001", which is clearly a wrong value. Please help me understand whether I am going about this the right way, or whether there is anything else I have to select at the update-rule (formula) level to get proper date values; in other words, how should I approach getting the Original GI Date?
    It is quite urgent... please, can anyone help solve the issue?
    Regards,
    Venkat.

    Hi Venkatesh,
    Please search the SDN forum before you post any questions. There are lots of threads available for this scenario.
    Use formula variables in BEx in order to achieve your requirement.
    The links below may be helpful:
    Formula variable with replacement path on system date
    How to create days count variable in SAP BI-BEx
    http://help.sap.com/saphelp_nw70/helpdata/en/f1/0a56a7e09411d2acb90000e829fbfe/frameset.htm
    Regards,
    Ravi Kanth.
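    To see why the raw subtraction goes wrong, here is a minimal Python sketch (outside BW, purely illustrative): subtracting the DDMMYYYY values as plain numbers produces a meaningless figure, while converting them to real dates first gives the intended day difference. The same principle applies in the update rule: the fields must be handled as dates, not as numeric strings.
    from datetime import datetime

    promise_raw, transport_raw = "21112008", "20112008"  # DDMMYYYY strings from the question

    # Subtracting the raw values as numbers gives a meaningless result, not a date.
    print(int(promise_raw) - int(transport_raw))  # 1000000

    # Converting to real dates first gives the actual difference in days.
    promise = datetime.strptime(promise_raw, "%d%m%Y").date()
    transport = datetime.strptime(transport_raw, "%d%m%Y").date()
    print((promise - transport).days)  # 1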

  • What will be the data type for the newly created service interface definition?

    Hi,
    I am using a new service interface for the start event so that any outside application can trigger the BPM process. I have created the service interface with its operation and input parameters. When the request comes in, an exclusive choice gateway decides, based on the input fields, which path to take. Now I need to map the service interface input to a data object. Which data type should I use? Ideally it should be the data type defined in the service interface, but I am not able to locate it in the data type folder.
    How can I map the input service interface data type to the data object so that it is available in the next BPM steps?
    Thanks,

    Hi,
    Typically the used types in a new service interface are anonymous. Try making the used complex type a global one (Right-Click onto it -> Refactor -> Make Anonymous Type Global).
    Afterwards you could use the speed button around your 'start event' and create a new 'data object' from there. The IDE then automatically assigns the type of the service interface to your 'data object' and performs the standard mapping between the 'start event' and the 'data object' in addition.
    Please also have a look at the documentation for further information:
    Accelerated Modeling with Speed Buttons
    http://help.sap.com/saphelp_nwce711/helpdata/en/16/52f063cac643d2917347aab86930ef/frameset.htm
    There is also an interesting blog entry dealing with data objects and their reuse:
    How to avoid modeling errors in Netweaver BPM? Part 3: Data flow in style
    /people/soeren.balko/blog/2009/02/03/how-to-avoid-modeling-errors-in-netweaver-bpm-part-3-data-flow-in-style
    Hope that helps,
    Martin
