Data Manager package and process chain not working

Hi All,
I executed a data manager package which contains a process chain to revalue one of my Account dimension members, say "Revenue". I am working on BPC NW 7.0.
Steps I followed:
1. I created a script logic file and created a custom process chain.
    process chain steps:
  a) Start variant
  b) Modify dynamically
  c) Run Logic
  d) OR and Clear BPC Tables
2. This process chain was included in data manager package.
3. The data manager package was modified to include the parameters and the script logic file name.
4. Executed the data manager package.
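For reference, a minimal revaluation script logic file generally looks like the following; the member name and factor here are only placeholders, not the actual file:

```
*WHEN ACCOUNT
*IS "REVENUE"
*REC(FACTOR=1.1)
*ENDWHEN
*COMMIT
```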
The issue is: when I execute the Data Manager package I don't get any error, but when I view the status I don't see any package running or completed. If I look at the process chain, it is failing at the first step, Modify Dynamically. No clue why.
Could you please let me know what the issue could be?
Cheers,
SAC

I encountered this problem. Try these steps:
I. First, check whether your process chain exists in the Library.
II. If yes, follow the steps below:
1. eData > Organize Package > Modify your package.
2. Check that the package points to the correct process chain.
3. If yes, click View Package at its right side.
4. Expand the Task folder and take note of the task name (e.g. ZBPC_PROT_RUN_LOGIC).
5. Click Advanced and compare the task name you noted with the one used in the TASK statements
(e.g. TASK(ZBPC_PROT_CF_RUN_LOGIC,SUSER,%USER%)).
6. They should be the same.
A package that runs but never shows any status usually means the system cannot find your process chain.
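As an illustration, the Advanced (dynamic) script of such a package ties each parameter to the task name; all names below are placeholders following the pattern above, not your actual values:

```
PROMPT(SELECTINPUT,,,"Select the members to revalue","%DIMS%")
TASK(ZBPC_PROT_CF_RUN_LOGIC,SUSER,%USER%)
TASK(ZBPC_PROT_CF_RUN_LOGIC,SAPPSET,%APPSET%)
TASK(ZBPC_PROT_CF_RUN_LOGIC,SAPP,%APP%)
TASK(ZBPC_PROT_CF_RUN_LOGIC,SELECTION,%SELECTION%)
TASK(ZBPC_PROT_CF_RUN_LOGIC,LOGICFILENAME,REVALUE.LGF)
```

If the task name in the TASK statements does not match the task name inside the process chain, the package will submit but never report a status.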
hope this helps,

Similar Messages

  • Debugging LCDS Data Management from Flex Builder 3 on Tomcat does not work

    I have a sample with Data Management, and when I use it on my Tomcat it works fine. But I can't debug it from Flex Builder 3. When I start it from Flex Builder I get the connection and then it crashes with "No registered fault handler or token responder - throwing an error for destination:"
    It is the same Tomcat I use to run standalone.
    Is it a problem with the context.xml?
    [SWF] /flexine/dataService_wt2.swf - 1.422.965 bytes after decompression
    'cds-consumer-sandwich-null' consumer set destination to 'sandwich'.
    Configuration for destination='sandwich':
    New DataService for destination: sandwich
    DataService.fill() called for destination: sandwich with args: []
    '4A853118-F24B-5FE2-BF01-FCDFF6C93161' producer set destination to 'sandwich'.
    Creating a new independent data store for destination: sandwich
    Adding data service: sandwich to the data store: null initialized: false
    Finished validating destination: sandwich loadOnDemand/paged associations: [] sub-types:
    data store: null is initialized
    'my-rtmp' channel endpoint set to rtmp://localhost:2038
    'my-rtmp' channel settings are:
    'ds-producer-sandwich' producer sending message '52AF7949-7B16-7C8E-3B87-FCDFF6F8D3A2'
    'my-rtmp' channel got connect attempt status. (Object)#0
      code = "NetConnection.Connect.Success"
      description = "Connection succeeded."
      details = (null)
      DSMessagingVersion = 1
      id = "E35ECC7A-710F-4D9B-A507-330FDD83B283"
      level = "status"
      objectEncoding = 3
    'my-rtmp' channel is connected.
    'my-rtmp' channel sending message:
    (mx.messaging.messages::CommandMessage)
      body=(Object)#0
      clientId=(null)
      correlationId=""
      destination="sandwich"
      headers=(Object)#0
      messageId="CDC66303-F24B-1E87-4103-FCDFF8211298"
      operation="client ping"
      timeToLive=0
      timestamp=0
    'my-rtmp' channel sending message:
    (mx.data.messages.DataMessage)
      messageId = '52AF7949-7B16-7C8E-3B87-FCDFF6F8D3A2'
      operation = fill
      destination = sandwich
      identity = (null)
      body = []
      headers = {}
    'ds-producer-sandwich' producer connected.
    'ds-producer-sandwich' producer acknowledge of 'CDC66303-F24B-1E87-4103-FCDFF8211298'.
    'ds-producer-sandwich' producer acknowledge of '52AF7949-7B16-7C8E-3B87-FCDFF6F8D3A2'.
    'ds-producer-sandwich' producer fault for '52AF7949-7B16-7C8E-3B87-FCDFF6F8D3A2'.
    Dispatching fault event for destination: sandwich
    No registered fault handler or token responder - throwing an error for destination: sandwich
    [RPC Fault faultString="There was an unhandled failure on the server. javax/transaction/SystemException" faultCode="Server.Processing" faultDetail="null"]
         at mx.data::ConcreteDataService/http://www.adobe.com/2006/flex/mx/internal::dispatchFaultEvent()[C:\depot\flex\branches\enterprise_corfu_b2\frameworks\projects\data\src\mx\data\ConcreteDataService.as:2401]
         at DataListRequestResponder/fault()[C:\depot\flex\branches\enterprise_corfu_b2\frameworks\projects\data\src\mx\data\ConcreteDataService.as:6970]
         at mx.rpc::AsyncRequest/fault()[E:\dev\3.0.x\frameworks\projects\rpc\src\mx\rpc\AsyncRequest.as:103]
         at NetConnectionMessageResponder/statusHandler()[E:\dev\3.0.x\frameworks\projects\rpc\src\mx\messaging\channels\NetConnectionChannel.as:523]
         at mx.messaging::MessageResponder/status()[E:\dev\3.0.x\frameworks\projects\rpc\src\mx\messaging\MessageResponder.as:222]

    Here's an update for future victims doing google searches :-)
    The issue appears to be related to compiz. When I'm doing basic 2D desktop with metacity everything simply rocks: the machine suspends/resumes with no issues.
    When compiz is enabled, there is about a 10-second delay after resume, during which the screen is black and you had better not touch the keyboard! If I sit and wait for 8-12 seconds, the dark screen goes away and the GDM login dialog shows up. But if I touch the keyboard or the touchpad, it will either freeze or become exceptionally slow.

  • Infopackage insert in process chain does not work

    Hi,
    We upgraded a system to support package 13 (we are on version 7), and since this upgrade I cannot add an InfoPackage to process chains.
    When I try to insert an "Execute InfoPackage" process and click on the list of available InfoPackages, the system says:
    No data selected
    Of course I do have InfoPackages available.
    Do you have any idea?
    Thanks
    Cyril

    We had the same problem (BI 7.0, SP 13), but it was resolved after applying OSS note 1062704.
    Just apply that note; it should solve it.

  • Managed Clients and Time Machine Quota Not Working

    We operate a Mac OS X 10.6.4 Server with 10.6.4 Clients; the clients are all bound to the OpenDirectory and all the laptops should be backed up using TimeMachine Server. Therefore we created a computer group which contains all the client machine records of the laptops and defined managed TimeMachine preferences for this computer group:
    - the TimeMachine server URL: afp://server.domain.tld/TimeMachine/
    - „startup volume only“, „skip system files“ and „back up automatically“ are enabled.
    - and a backup limit of 50 GB is set.
    If I run „mcxquery“ on the laptops, the settings are displayed. And the TimeMachine backup works.
    But the size limit of 50 GB isn't respected; all client images grow „infinitely“.
    $ mcxquery
    com.apple.MCX.TimeMachine
    AutoBackup laptops (Computer Group) always 1
    BackupAllVolumes laptops (Computer Group) always 0
    BackupDestURL laptops (Computer Group) always afp://server.domain.tld/TimeMachine/
    BackupSizeMB laptops (Computer Group) always 51200
    BackupSkipSys laptops (Computer Group) always 1
    What am I missing!?
    Thanks
    Alex

    I presume "bought a Time Machine" means a Time Capsule.
    How did you migrate the Time Machine files?
    From where? A Time Capsule or an external drive?
    It is difficult to get TM working with Yosemite, since it doesn't work on the old TM backup after the upgrade; it will not work on the migrated files either.
    You simply start a new backup and store the old backups for a few months until you are ready to dump them.
    The instructions for inheriting old backups are B5 and B6 here:
    http://pondini.org/TM/Troubleshooting.html
    However, it is just unlikely to work; TM in Yosemite is very different. Broken, even.
    I also strongly recommend people use Carbon Copy Cloner or some other 3rd-party backup until Apple gets the bugs fixed. And after several months, they are still rampant.

  • SAP document link on process chain is not working

    Hi all,
    This link is not working:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/8da0cd90-0201-0010-2d9a-abab69f10045
    Please give me the correct link for the SAP document on process chains.
    Thanks,
    Madhu

    Hi Madhu,
    Go through the wiki below, where you will find most of the information related to process chains:
    https://wiki.sdn.sap.com/wiki/display/BI/Processchainscreationandmonitoring
    Process chain creation step by step
    /people/juergen.noe/blog/2008/01/11/process-chain-creation--step-by-step
    Process chain Scheduling
    /people/debjani.das/blog/2009/04/05/scheduling-process-chain
    New features in BI process chains
    /people/mallikarjuna.reddy7/blog/2007/02/08/new-features-in-bi-2004s-process-chains
    However, if you are looking at process chains from an SAP BI perspective, then please check the link below:
    http://help.sap.com/saphelp_nw04/helpdata/EN/8f/c08b3baaa59649e10000000a11402f/content.htm
    Hope these links help.
    Regards
    KP

  • SMS_DISCOVERY_DATA_MANAGER Message ID 2636 and 620. Discovery Data Manager failed to process the discovery data record (DDR)

    Hi
    I'm seeing this critical error on my primary.
    SMS_DISCOVERY_DATA_MANAGER Message ID 2636 and 620. 
    Discovery data manager failed to process the discovery data record (DDR)"D:\Prog.....\inboxes\auth\ddm.box\userddrsonly\adu650dh.DDR", because it cannot update the data source.
    These DDRs actually end up under the ddm.box\userddrsonly\BAD_DDRS folder.
    I see a ton of DDR files in that folder. Not sure if I can delete them, so I moved them to a temp folder. AD User Discovery keeps generating them.
    Any help ?
    Thanks
    UK
    

    Check the ddm.log file for more information.
    My Blog: http://www.petervanderwoude.nl/
    Follow me on twitter: pvanderwoude

  • Process chain is not running

    Hi All,
    I have created a process chain and scheduled it for early morning, 2:00 AM. In the start variant, I specified the time and date and saved the process chain. I checked the scheduled jobs in SM37 and found that BI_PROCESS_TRIGGER is scheduled. Today I checked the log view and found the chain has not executed.
    Can you please help with this issue?
    Thanks,
    Jelina.

    Hi Jelina,
    In the scheduling options, check whether you have selected direct scheduling or scheduling via a meta chain, and set the desired one. Also, if you want it to repeat periodically, check the Periodic Job checkbox and select the period values from the Period Values tab.
    Hope this helps...

  • Need help in triggering the Data stream load  using process chain

    Hi Guru's
    Is it possible to trigger a data stream load using a process chain?
    Any help is highly appreciated.
    Thanks
    Indiran

    Hi Indiran and welcome aboard!
    I don't think this is possible. SAP BW and SAP SEM-BCS are rather independent systems. Though BCS lives on top of the BI-BW stack, it may even have master data different from that in BW.
    Process chains, AFAIK, are entirely a BW feature. Certainly, you may use process chains on the BW side, loading the ODS/DSOs and cubes involved in the BCS data model.
    The main con here is the lost transparency: you don't control everything from the consolidation monitor.
    The pro side is also rather obvious to me. Since there is very often a huge difference between data quality at the data source and in the BCS totals cube, I need to do a lot of data transformation: not only calculations or cleaning, but also transformation of the data model (key figure model -> account model). That is much easier to do in BW, for me.
    I even call the ODS objects/cubes/routines involved in such transformation an intermediate layer, the layer between the data source and SEM-BCS.
    And this layer lives rather independently from BCS.
    Hope this helps.

  • Differences Between Infopackage Groups and Process chains

    Hi All,
    Can anybody explain to me the differences between InfoPackage groups and process chains, and why process chains are more convenient?

    hi nagarjuna,
    An InfoPackage group is a collection of InfoPackages. In order to summarize data requests which logically belong together from a business point of view, and thereby simplify the request, you can collect data requests (meaning the InfoPackages) into an InfoPackage group. You can schedule each of these groups in the scheduler. With the data request, the scheduler can access InfoPackage groups directly and thereby request more than one InfoPackage at a time, according to the settings in the InfoPackage group. Thus, InfoPackage groups support you in serializing your data requests.
    A process chain, on the other hand, is a sequence of processes that wait in the background for an event. Some of these processes trigger a separate event that can in turn start other processes. Hit this link to know about process chains:
    http://help.sap.com/saphelp_nw04/helpdata/en/8f/c08b3baaa59649e10000000a11402f/content.htm
    Process chains are more convenient because, unlike InfoPackage groups, you don't have to schedule each package individually. And in process chains, monitoring is very convenient since you see everything on a single page.
    regards
    sham'm

  • How to find unassigned master data text objects in process chain in bi

    Hi
    Please let me know how to find the unassigned master data text objects in a process chain.

    hi,
    Actually, if you want to find out whether your unassigned InfoObject is present in a process chain, you can do so just by right-clicking on your DTP (master data text DataSource -> master data text). If it is present in a process chain, the process chain option will be active on right-click; by clicking it you can find the name of the chain in which it exists.
    thanks.

  • Infopackage in process chain is not scheduling

    Hi All
    One of the InfoPackages in my process chain is not scheduling, and it shows an error message like "Attributes are not yet maintained. Entire chain now has status 'R'".
    What does this mean?
    Can anyone explain it in detail and tell me how to correct it?
    Regards
    balaji

    Hi balaji,
    how big is your process chain? I got a similar problem when I built a meta chain with a lot of processes in the chain (more than 60). In this case, the error message is misleading. If your process chain has a similar number of processes, check OSS note 942804 (Jobcount handling).
    Kind regards,
    Jürgen

  • Process Chain : 0PM_INIT_P01 - Not Shown in RSPC

    Dear Friends
    I have installed the 0PM_INIT_P01 (Initialisation: MTTR/MTBR) process chain.
    It installed successfully.
    But I am not able to see it in RSPC.
    Can anybody tell me why that is?
    Somewhere I found that authorization object S_RS_PC is needed for my user ID. Is that so?
    Then how can I map it to my user ID?
    Points will be awarded for right answers
    Regards
    Raju Saravanan

    Friends
    The problem is that the installed process chain is not displayed in RSPC for further operations.
    Before that, I installed the process chain through Business Content (0PM_INIT_P01), and it got installed successfully. I saw it in SM37.
    For testing purposes I gave the same name (0PM_INIT_P01) to another chain creation.
    But the system says the name already exists.
    My confusion: why is it not displayed anywhere in RSPC?
    Regards
    Raju Saravanan

  • Process chain is not getting scheduled

    Hi all ,
    My process chain is not getting scheduled. I have saved and checked it and wanted to activate it, but it goes into a loop and is not getting activated. Please suggest a solution.
    Regards
    Akhilesh

    Hi Akhilesh,
    Can you check that your user has the necessary authorizations to schedule a job? Can you also check if there is an error message generated in SM21 or a dump in ST22? You say as well that the scheduling is looping; in SM50, what table/statement is the process looping over? Can you also let us know your BW release and support package level?
    Best Regards,
    Des

  • Including an Event Data Change in a Process Chain

    There is a step called "Execution with Data Change in InfoProvider" in the process of including an event data change in a process chain.
    How do we carry this out? Should this be done in BEx or in the process chain itself?

    Dear Eshwari,
    1) To get the 'Execution on data change' option, you need a process chain with the 'Event data change' process type which contains the InfoProvider (the InfoProvider on which the query you are trying to broadcast is created) in its process variant.
    2) Further more, additional authorization is also needed in order for the user to see such scheduling options in broadcaster.
    Please assign this authorization object to the affected users.
    BEx Broadcasting Authorization to Schedule: S_RS_BCS
         Activity                               *
         Event ID in Broadcasting Framework     *
         Event Type in Broadcasting Framework   *  <== here you should select *
         ID of a BI Reporting Object            *
         Type of a BI Reporting Object          *
    3) Please also ensure the user has not only the Business Explorer role but also the Business Intelligence role in Portal.
    Regards,
    Arvind

  • Data Load Requirement through Process Chain

    Hello All,
    I have implemented an area through a custom data source and following is my data load requirement through Process Chain. Request you to kindly help me for the same.
    1. The design is an InfoCube, updated via a transformation from the custom DataSource.
    2. For the entire year 2008 I want a single request to be loaded, so it gets loaded into the PSA and then into the InfoCube through a (delta) DTP.
    3. Now I have created an InfoPackage (full update) with year 2009 in the selection. That is, I want the data load henceforth to be for the year 2009.
    4. Hence, the InfoPackage will run to bring the 2009 data into the PSA, and I run the (delta) DTP to update the same in the cube.
    5. Now, what I require is that every day the InfoPackage (full update) for 2009 should run and bring data into the PSA, and the same should be updated in the InfoCube after deleting the 2009 request already present, keeping the previously loaded 2008 request intact.
    I hope the above is not confusing.
    Please let me know if i can elaborate the same.
    Thank you.
    Regards,
    Kunal Gandhi

    Hi,
    Please go through the links.
    http://help.sap.com/saphelp_nw04/Helpdata/EN/21/15843b74f7be0fe10000000a114084/frameset.htm
    http://help.sap.com/saphelp_nw04/Helpdata/EN/f8/e5603801be792de10000009b38f842/frameset.htm
    http://help.sap.com/saphelp_nw04/Helpdata/EN/b0/078f3b0e8d4762e10000000a11402f/frameset.htm
    These may help you in designing the process chain as required.
    Regards,
    Sunil
    Edited by: sunil kumar on Apr 17, 2009 6:20 PM
