Session in workflow

Hi!
I know that ":display.session" doesn't work in workflows. With <invoke class='getLighthouseContext'><ref>WS_CONTEXT</ref></invoke> I can only get an InternalSession, but I need a LocalSession. How can I get one in a workflow?
Thanks.

Try this:

<Rule name='getAdminSession'>
  <invoke name='getSession' class='com.waveset.session.SessionFactory'>
    <invoke name='getSubject'>
      <new class='com.waveset.server.InternalSession'/>
      <s>#ID#Configurator</s>
    </invoke>
    <s>Admin</s>
    <s>Administrator Interface</s>
  </invoke>
</Rule>
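If I remember the XPRESS syntax correctly, you can then call this from a workflow action with <rule name='getAdminSession'/> and bind the result to a workflow variable. Double-check the argument order of SessionFactory.getSession against your Waveset version; I'm quoting it from memory.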

Similar Messages

  • How to call an object in the Session in a workflow?

    In order to make my application work efficiently, I am going to put a class
    instance in the Session. However, how can I call this object in a workflow
    (e.g. in a business operation)?
    Or can anybody tell me whether there is another way to maintain an
    instance across several workflows? By using variables, we can only
    maintain an instance (e.g. an EJB or JavaBean) within a single workflow. Am I
    right?
    Thanks a lot.
    * Name: Gary Wang
    * Tele: 010-65546668-8119
    * Mail: [email protected]

    Is your "object" and "method" in a "package" Java? Loaded in the database or outside? (ie is it a java stored procedure? )
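    If the "Session" here means the servlet HttpSession (the thread never says which workflow engine this is, so treat this purely as an illustration), storing and retrieving a shared instance looks like the sketch below; the CustomerCache class and the attribute name are hypothetical:

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    public class WorkflowEntryServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
            HttpSession session = req.getSession(true);
            // Store the instance once; any later request that can reach the
            // same session sees the same object.
            CustomerCache cache = (CustomerCache) session.getAttribute("customerCache");
            if (cache == null) {
                cache = new CustomerCache();                  // hypothetical helper class
                session.setAttribute("customerCache", cache); // attribute name is arbitrary
            }
        }
    }

    // Hypothetical stand-in for whatever instance you want to share.
    class CustomerCache { }

    Whether a workflow step can reach that object depends entirely on whether the engine runs inside the same web session; a container-managed workflow usually cannot.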

  • Get JCR Session in a workflow in CQ 5.5

    Hi,
    I am trying to get a javax.jcr.Session in my workflow:
    public class GroupCreatorWorkflow implements WorkflowProcess {
        private static final Logger log =
                LoggerFactory.getLogger(GroupCreatorWorkflow.class);
        private ResourceResolverFactory resourceResolverFactory;

        public void execute(WorkItem item, WorkflowSession wsession, MetaDataMap metadata)
                throws WorkflowException {
            try {
                ResourceResolver resolver =
                        resourceResolverFactory.getAdministrativeResourceResolver(null);
                Session session = resolver.adaptTo(javax.jcr.Session.class);
            } catch (Exception e) {
                log.info("Exceptions @@@@@@@");
                log.info(e.getMessage());
            }
        }
    }
    Log :
    /content/demo/default] com.demo.wcm.GroupCreatorWorkflow Exceptions
    /content/demo/default] com.demo.wcm.GroupCreatorWorkflow null
    Please let me know if I am going about getting the JCR Session the wrong way.
    Thanks
    DR Reddy

    The execute method receives a WorkflowSession, so wsession.getSession() should give you the JCR session. More API details at
    http://dev.day.com/docs/en/cq/current/javadoc/com/day/cq/workflow/WorkflowSession.html
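    A minimal sketch of that approach (class and method names kept from the question; only the body of execute changes):

    import javax.jcr.Session;
    import com.day.cq.workflow.WorkflowException;
    import com.day.cq.workflow.WorkflowSession;
    import com.day.cq.workflow.exec.WorkItem;
    import com.day.cq.workflow.exec.WorkflowProcess;
    import com.day.cq.workflow.metadata.MetaDataMap;

    public class GroupCreatorWorkflow implements WorkflowProcess {
        public void execute(WorkItem item, WorkflowSession wsession, MetaDataMap metadata)
                throws WorkflowException {
            // The workflow engine hands you an already-authenticated session,
            // so no ResourceResolverFactory lookup is needed here.
            Session session = wsession.getSession();
            // ... create the group, save, etc. ...
        }
    }

    As for the null in the log: resourceResolverFactory in the original code is declared but never injected (there is no @Reference annotation), so the most likely exception is a NullPointerException, whose getMessage() is null.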

  • We have 2 sessions in a workflow; if the 1st session fails, the 2nd session still has to execute and the workflow should go to Suspended mode. How can we achieve this?

    Thanks Veeru. In this case the workflow will go to the Failed state, but I want it in Suspended mode. Can you please help over here?

    Hi Mounika,
    At session level you have the option Fail Parent If This Task Fails.

    Failing the Parent Workflow or Worklet
    You can choose to fail the workflow or worklet if a task fails or does not run. The workflow or worklet that contains the task instance is called the parent. A task might not run when the input condition for the task evaluates to False.

    To fail the parent workflow or worklet if the task fails, double-click the task and select the Fail Parent If This Task Fails option in the General tab. When you select this option and a task fails, it does not prevent the other tasks in the workflow or worklet from running. Instead, the Integration Service marks the status of the workflow or worklet as failed. If you have a session nested within multiple worklets, you must select the Fail Parent If This Task Fails option for each worklet instance to see the failure at the workflow level.

    To fail the parent workflow or worklet if the task does not run, double-click the task and select the Fail Parent If This Task Does Not Run option in the General tab. When you choose this option, the Integration Service fails the parent workflow if a task did not run.

    This will help you.
    Regards,
    Veeru

  • How to loop a session in workflow?

    Hi All, I have imported a source from an Oracle DB; it refreshes its data every day between 3 and 4 pm. I want to start the session immediately after my source gets refreshed, i.e. trigger the workflow as soon as the refresh completes. Can any one of you please help me? Thank you, Mahesh.


  • Call stored procedure from pre-session in workflow

    Hi Friends, how can we call a stored procedure from the pre-session command in the session properties? The procedure lives inside a package. I used the following: call <schema_name>.<package_name>.<procedure_name>, but it is not calling the procedure. Could anybody help me with this?

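    The thread has no resolution, but one quick way to verify the call outside Informatica is a JDBC CallableStatement, sketched below with placeholder schema/package/procedure names and connection details. One common gotcha: Oracle's SQL CALL statement requires parentheses after the procedure name even when there are no arguments, so a bare call <schema_name>.<package_name>.<procedure_name> can fail where call ...() succeeds.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class CallPackagedProc {
        public static void main(String[] args) throws Exception {
            // Placeholder JDBC URL and credentials.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
            // JDBC escape syntax for calling a procedure inside a package.
            try (CallableStatement cs =
                         conn.prepareCall("{call MYSCHEMA.MYPKG.MYPROC()}")) {
                cs.execute();
            }
            conn.close();
        }
    }

    If this runs fine, the problem is on the Informatica side (the session's connection or the pre-session command syntax) rather than in the procedure itself.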

  • When starting the workflow it errors out

    hi,
    I created a workflow, saved it, and started it, but the workflow failed. When I check the session log, it errors out with:
    "Unable to Fetch log.
    The log service has no record of the requested session or workflow run."
    How do I resolve this problem?
    Regards
    Prashant

    Hi,
    Check the task properties by double-clicking the session > Properties, and verify that the Session Log File Directory property is set to $PMSessionLogDir\ and that you have given a log file name in the Session Log File Name property. If you haven't given a log file name, specify one such as xyz.txt or taskname.txt.
    Once you are done with the above step, save the session, re-run the workflow, and check for the session log.
    Thanks,
    Navin Kumar Bolla

  • Create and Populate a Hyperion Planning Cube using Hyperion Data Integration Management

    Friends,
    I am new to Essbase and have worked extensively in Informatica. Hyperion DIM (an OEM version of Informatica) was chosen to create and populate a Hyperion Planning system (with an Essbase cube in the backend).
    I am using Hyperion DIM 9.3.
    Can someone let me know (or share a document) how I can do the following
    1) Create a Planning application with an Essbase cube in the backend using Hyperion Data Integration Management
    2) Populate the Essbase outline and the actual Essbase cube with data using DIM.
    Thanks a lot for all help.

    Hi,
    You cannot create planning applications using DIM.
    To load metadata have a look at: http://www.oracle.com/technology/obe/hyp_fp/DIM_Planning/OBE_Dim_Planning.html
    You can refresh the Planning database in DIM by enabling the Refresh Database property for a session:
    In Workflow Manager, right-click the session and select Edit.
    Click the Mapping tab.
    Select a Planning target.
    Check the Refresh Database box.
    Ok?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • BODI - Job Error "blank message, possibly due to insufficient memory"

    I am ending up with the below error while executing a DI job. The job uses a 37 MB file as source. As the error looks memory-related, I tried splitting the input file and executing the job again; it then completed successfully without any error.
    Could someone help me find the memory setting that needs to be investigated at the dataflow level to get a permanent solution?
    Hoping for help!
    (11.7) 03-04-11 08:18:06 (E) (21097:0001) RUN-050406: |Session SM_DM_ACCESS_LOG_DTL_F_JOB|Workflow SM_DM_ACCESS_LOG_DTL_F_WF|Dataflow SM_DM_ACCESS_LOG_DTL_F_DF
    Data flow <SM_DM_ACCESS_LOG_DTL_F_DF> received a bad system message. Message text from the child process is <blank message, possibly due to insufficient memory>. The process executing data flow <SM_DM_ACCESS_LOG_DTL_F_DF> has died abnormally. For NT,
    please check errorlog.txt. For HPUX, please check stack_trace.txt. Please notify Customer Support.
    (11.7) 03-04-11 08:18:06 (E) (21097:0001) RUN-050409: |Session SM_DM_ACCESS_LOG_DTL_F_JOB|Workflow SM_DM_ACCESS_LOG_DTL_F_WF
    The job process could not communicate with the data flow <SM_DM_ACCESS_LOG_DTL_F_DF> process. For details, see previously
    logged error <50406>.

    Hi,
    loading a 37 MB file shouldn't be a problem without splitting it; I've loaded GB-size flat files without problems.
    Did you check the errorlog.txt as stated in the message? What's in there?
    If you can load the file after splitting it, you have enough space in your DB.
    Please check the memory utilization of your server while executing the job with the single file. Maybe the server is too busy, which would be strange with a 37 MB file.
    Regards
    -Seb.

  • Invalid Column name DI BW load

    Hi All,
    I am running a load from an Access DB to a BW ODS. Is it a rule that the source table column names should match the SAP BW target table/InfoSource column names?
    Access EMP Table - Source
    =====================
    ID
    FIRSTNAME
    LASTNAME
    BW ODS structure TARGET
    ===============
    EMPID
    0BP_FSTNAME
    0BP_LSTNAME
    0RECORDMODE
    This is the err log
    Job name: Job_Load_EMP_to_BW
    (12.1) 08-29-09 14:56:24 (E) ( VAL-030160: |SESSION Job_Load_EMP_to_BW|WORKFLOW WF_Extract_Data|DATAFLOW DF_EMP_Data|STATEMENT <GUID::'8237cee8-f030-41e0-9106-7adbca5baf7a' LOAD TABLE BW_TARGET_DS."".ZEMP_HC INPUT(Query)>
                                                         Column <ID> in query <Query> must have the same name as column </BIC/EMPID> in target table <ZEMP_HC>. Please change the name
                                                         of the column or add a new column to the target table.
    Thanking You.
    Rao

    Hi Manoj,
    I renamed the output columns.
    When I refresh the InfoSource on the DI side, it won't refresh itself; is there a trick behind it?
    As for datatype conversions from Access to BW: do I do any manual steps, or will the Query transform manage it? I could see some yellow warning messages regarding those, but it worked.
    Thanks
    Rao

  • Duns & Bradstreet Implementation for Vendor Analytics

    Hi All,
    Our client wants to go for a D&B implementation. We have provided the Vendor Master details extract to D&B and they have responded with the files (CSV) that need to be uploaded to BI for Vendor Analytics.
    We have not yet installed the D&B BI Content objects (the DUNS number InfoObject and 0Vendor need to be activated). As we have to reactivate 0Vendor and we already have some customization on it (the 0Vendor_attr datasource has been enhanced), what precautions need to be taken?
    If someone has installed D&B, any inputs/suggestions will be really helpful. Thanks.
    Regards,
    Kavitha

    Have you tried running the SQL against your source? I ran this against one of our EBS instances and it runs fine.
    If it runs OK, I would next check which connection the Informatica workflow is using. Check the full session and workflow logs to make sure it's connecting to the correct database.

  • ASUG Presentation - April 2007 - Do you have the ABAP Classes presentation?

    Hi all,
    does anyone have a copy of the ASUG presentation that supported the session:
    "Classy Workflows: Moving from Business Objects to ABAP Classes - practical examples, and case studies, for using ABAP Classes in Workflow applications".
    Can you please send it to [email protected]
    Much thanks and regards,
    Cristiana

    Hi there
    In Captivate, try opening the preferences by pressing Shift+F8 and navigating to the Project > Start and End and clearing the Auto Play option. Re-publish and re-insert into the PowerPoint and see if that helps.
    Cheers... Rick

  • How to generate dynamic param file

    Hi, can anyone please tell me how to generate a dynamic parameter file from a table? The table has the following structure: create table parameter_temp (Folder_name, Workflow_name, session_name, parameter_name, parameter_value). This table stores all the parameters for all the workflows. How can I generate a parameter file for a specific session or workflow? Thanks. (A sketch of one approach appears after the reply below.)

    Hi All, I have a scenario to implement and need urgent help. I have a relational source with the below fields:

    ID, Account, AccountType, Balance
    1, 1001, GL, 46567.90
    1, 1002, SL, 56889.97
    1, 1003, Equity, 45543.908
    1, 1004, GL, 89.54
    1, 1005, SL, -56889.97
    1, 1006, Equity, -45543.908
    1, 1007, SL, -42777.45

    My first requirement is to check whether the balance for the entire file is 0 and whether the balance for each AccountType is 0. If both conditions are satisfied the flow should proceed; otherwise the load should fail. I tried the approach SQ >> Expression >> Aggregator: in the Aggregator I first calculated sum(Balance) for the entire file by grouping on the ID column, keeping the actual data in an Expression transformation. Then I took an Expression to connect the actual data with the sum(Balance), since I need to perform further calculations. I tried connecting the Expression and the Aggregator, but it is not allowing me to connect the ports. And if I use a Joiner, wrong data is loaded to the target: it joins each result of the Aggregator with the actual ports in the Expression. I am not sure how to handle this scenario so that I can calculate the sum of the entire file and the sum per AccountType; if both sums are 0 then load the target, else fail.
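    Returning to the original question: below is a minimal sketch that reads parameter_temp over JDBC and writes a parameter file using the standard [Folder.WF:workflow.ST:session] section format. The JDBC URL, credentials, and output file name are placeholders.

    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ParamFileGenerator {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery(
                         "SELECT Folder_name, Workflow_name, session_name, "
                       + "parameter_name, parameter_value FROM parameter_temp "
                       + "ORDER BY Folder_name, Workflow_name, session_name");
                 PrintWriter out = new PrintWriter("wf_params.txt")) {
                String lastSection = "";
                while (rs.next()) {
                    // One [Folder.WF:workflow.ST:session] header per session.
                    String section = "[" + rs.getString(1) + ".WF:" + rs.getString(2)
                                   + ".ST:" + rs.getString(3) + "]";
                    if (!section.equals(lastSection)) {
                        out.println(section);
                        lastSection = section;
                    }
                    out.println(rs.getString(4) + "=" + rs.getString(5));
                }
            }
            conn.close();
        }
    }

    To generate the file for one specific workflow or session, add a WHERE clause on Workflow_name/session_name, then point the session's or workflow's Parameter Filename attribute at the generated file.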

  • What are the log files created while running a mapping?

    Hi Sasi, I have a doubt about who actually generates the "Log Events". Is it the Log Manager or the Application Services?

    Hi Anitha,
    The Integration Service will generate two logs when the mapping runs:
    1) Session log -- has the details of the task, session errors, and load statistics.
    2) Workflow log -- has the details of the workflow processing and workflow errors.
    The workflow log is generated when the workflow starts, and the session log is generated once the session initiates. For more detail please refer to the Informatica help docs. Normally the services generate the logs, e.g. the IS and RS log their own activity.
    The below process happens when the workflow is initiated [copied from the Informatica help docs]:
    1. The Integration Service writes binary log files on the node. It sends information about the sessions and workflows to the Log Manager.
    2. The Log Manager stores information about workflow and session logs in the domain configuration database. The domain configuration database stores information such as the path to the log file location, the node that contains the log, and the Integration Service that created the log.
    3. When you view a session or workflow in the Log Events window, the Log Manager retrieves the information from the domain configuration database to determine the location of the session or workflow logs.
    4. The Log Manager dispatches a Log Agent to retrieve the log events on each node to display in the Log Events window.
    Thanks
    Sasiramesh

  • DI on Solaris erratic behavior

    While executing jobs on Solaris, various dataflows are failing inconsistently with a message similar to the one below:
    (11.7) 11-12-08 15:13:40 (E) (17833:0001) RUN-050406: |Session JOB_NOMIS_MIS_Load_D|Workflow C_WF_OFFENDER_MOVEMENTS|Workflow WF_FACT_MOVEMENT_EXTERNAL|Dataflow DF_FACT_MOVEMENT_EXTERNAL_SUMMARY_1_MIS Data flow DF_FACT_MOVEMENT_EXTERNAL_SUMMARY_1_MIS> received a bad system message. Message text from the child process is <blank message, possibly due to insufficient memory>. The process executing data flow <DF_FACT_MOVEMENT_EXTERNAL_SUMMARY_1_MIS> has died abnormally. For NT, please check errorlog.txt. For HPUX, please check stack_trace.txt. Please notify Customer Support.
    (11.7) 11-12-08 15:13:40 (E) (17833:0001) RUN-050409: |Session JOB_NOMIS_MIS_Load_D|Workflow C_WF_OFFENDER_MOVEMENTS|Workflow WF_FACT_MOVEMENT_EXTERNAL The job process could not communicate with the data flow <DF_FACT_MOVEMENT_EXTERNAL_SUMMARY_1_MIS> process. For details, see previously logged error <50406>.
    The failure is not consistent, even though the same code executes with the same data.  I've been told the job server has plenty of available memory and the failing dataflows do not have a large memory footprint.  As I'm not a UNIX person, I don't know where to start looking. 
    Any help pointing us at where to look (with the configuration) would be appreciated.
    Thanks
    Michael

    $ ulimit -a
    core file size        (blocks, -c) unlimited
    data seg size         (kbytes, -d) unlimited
    file size             (blocks, -f) unlimited
    open files                    (-n) 256
    pipe size          (512 bytes, -p) 10
    stack size            (kbytes, -s) 8192
    cpu time             (seconds, -t) unlimited
    max user processes            (-u) 29995
    virtual memory        (kbytes, -v) unlimited
    env4-bodimis1:/u01/app/bobj/product/11.7.3.0/di_1/log
    One dataflow has a Query and a Table_Comparison transform; the other has lots of Queries, a Case, and a Validation transform.
    I've been told it's always the same data being processed (as it's being release-tested at the moment).
    Michael
