MaxL to clear partial data in ASO applications

Hi,
Is there a MaxL statement to clear partial data from ASO applications?
Thanks
Kannan.

Hi,
Another solution, close to what garycris suggested, is to create a report script that exports the subset of data you want to delete to a text file. Then load this text file using a load rule that replaces your data column with a #Missing column.
You can then call the report script and the load rule from a calc script.
The problem is that you also need a BSO database to do that. If that's the case, then you're good.
If not, you can probably do the same thing from a MaxL script.
Good practice would be to have one CLRASO load rule for all your clear processes, and then make sure that your different clear report scripts fit its format. You end up with n clear report scripts, n clear calc scripts or MaxL scripts, and 1 clear load rule.
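To make the MaxL-only route concrete, here is a minimal sketch of such a wrapper script; the application, database, report script, rules file and file names are all hypothetical and the exact paths depend on your environment:

login 'admin' identified by 'password' on 'localhost';
/* Export the subset you want to clear via a report script... */
export database 'BsoApp'.'BsoDb' using server report_file 'ClrRep' to data_file 'clr_subset.txt';
/* ...then reload it through the CLRASO rule that maps the data column to #Missing. */
import database 'AsoApp'.'AsoDb' data from local text data_file 'clr_subset.txt'
    using server rules_file 'CLRASO' on error abort;
logout;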
Cyril
Edited by: user635693 on Jan 12, 2009 23:25

Similar Messages

  • Clear Partial Data in an Essbase Aggregate storage database

    Can anyone let me know how to clear partial data from an Aggregate Storage database in Essbase v11.1.1.3? We are trying to clear some data in our database and don't want to clear out all the data. I am aware that Version 11 of Essbase allows a partial clear if we write it using MDX commands.
    Can you please help me by giving some examples of the same?
    Thanks!

    John, I clearly get the difference between the two. What I am asking is that in the EAS tool itself, for v11.1.1.3, we have the option of right-clicking on the database and choosing "Clear", which in turn has sub-options like "All Data", "All Aggregations" and "Partial Data".
    I want to know more about this option. How will this option know which partial data is to be removed, or will it ask us to write some MaxL/MDX query for the same?
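    If it helps, the EAS "Partial Data" option corresponds to the MaxL clear-region statement, where you describe the slice to clear with an MDX set expression. A hedged sketch, with hypothetical application, database and member names:

    /* Logical clear: offsetting values are written for the region. */
    alter database 'MyApp'.'MyDb' clear data in region 'CrossJoin({[FY12]},{[Oct]})';
    /* Physical clear: the cells in the region are removed instead. */
    alter database 'MyApp'.'MyDb' clear data in region 'CrossJoin({[FY12]},{[Oct]})' physical;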

  • Please give the maxl scripts for export data in ASO

    I think we have to use some report file. Please give the report file script and the full MaxL statement to export data.

    There are a couple of ways to do this.
    What version of Essbase are you using?
    Do you want all the data exported or only a subset?
    Brian Chow
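    Depending on those answers, here are a couple of hedged MaxL sketches; the application, database, report script and file names are hypothetical:

    /* Export all level-0 data in the native export format. */
    export database 'AsoApp'.'AsoDb' level0 data to data_file 'lev0_exp.txt';
    /* Export a subset in the layout defined by a server report script. */
    export database 'AsoApp'.'AsoDb' using server report_file 'SubExp' to data_file 'subset.txt';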

  • Clear data in ASO - Dynamic Members

    Folks,
    I've been trying to clear partial data in my ASO cube but encountered this error -
    ERROR 1013358 - Dynamic members are not allowed in data clear region specification
    A few dimensions are tagged as dynamic and a few members have formulas on them. Is it possible to exclude those members in the MaxL script, or is there a better way of doing it?
    I read a post where Glenn mentioned he used an exclude clause, but I'm not familiar with the usage.
    Thank you!
    gugler

    Why do you want to clear dynamic members? Clear out the members that your dynamic member is based on.
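    One way to keep such members out of the clear region is to wrap the set in the MDX Except function. A hedged sketch with hypothetical database, dimension and member names; the members listed in the second set are assumed to be the ones carrying formulas:

    alter database 'MyApp'.'MyDb' clear data in region
      'CrossJoin({[FY12]},
                 Except([Measures].Levels(0).Members,
                        {[Formula Member1],[Formula Member2]}))';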

  • How can i clear the portion of data and load the new data in ASO App 9.3.1?

    Hi
    I want to delete a portion of data and reload a new portion of that data in an ASO application, version 9.3.1.
    Can anyone please share a good procedure?
    Thanks
    Kranthi

    Have a read of :- Clear Data by Fix
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Partial data clear in ASO possible for multiple tuples?

    Hi,
    I am trying to do a partial data clear in an ASO cube. I need to clear FY12->Oct & FY13->Nov (consecutive periods). Here's what I tried:
    alter database 'GL_TXT'.'GL_TXT' clear data in region '{([FY12],[Oct]),([FY13],[Nov])}' physical;
    No error is thrown but the data isn't cleared either. The statement finishes execution almost immediately.
    I tried the UNION function but that didn't work either. Here's how my statement looks with the UNION function:
    alter database 'GL_TXT'.'GL_TXT' clear data in region '{UNION({([FY12],[Oct])},{([FY13],[Nov])})}' physical;
    Again, no error but no clear either. The UNION pulls the correct data set when used in a SELECT statement:
    SELECT UNION({([FY12],[Oct])},{([FY13],[Nov])}) ON COLUMNS FROM GL_TXT.GL_TXT;
    I can get it to clear if I write separate statements for each period, but I want to have them in a single script, as I suspect two scripts wouldn't be very efficient.
    Please help!
    Thanks,
    Shashi

    Thanks for your reply Vasavya! Running the region clear scripts twice (once for each month) is still faster for me than using the report script approach. I want to see if having both periods in one statement will improve the performance :)
    Regards,
    Shashi
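    For reference, the two-statement workaround Shashi mentions can still live in a single MaxL script, for example:

    alter database 'GL_TXT'.'GL_TXT' clear data in region '{([FY12],[Oct])}' physical;
    alter database 'GL_TXT'.'GL_TXT' clear data in region '{([FY13],[Nov])}' physical;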

  • How to export level0 data in COLUMNAR form from an ESSBASE ASO application?

    Hello,
    I need to export the level-0 data from an 11.1.1.3 Essbase ASO application in COLUMNAR form.
    Please note that the size of the database is 6 GB and a report script failed.
    Is it possible? Please suggest a way.
    thanks,
    Ankit

    Why don't you use a report script?
    A few tips for building a report script when you have a lot of data:
    1. Decrease the amount of dynamic calcs in your outline. If you have to, make them dynamic calc and store.
    2. Use the <Sparse command at the beginning of the report script.
    3. Use the <Column command for the dense dimensions instead of using the <Page command. The order of the dense dimensions in the <Column command should be the same as the order of the dense dimensions in the outline (e.g. <Column (D1, D2)).
    4. Use the <Row command for the sparse dimensions. The order of the sparse dimensions in the <Row command should be the opposite of the order of the sparse dimensions in the outline (e.g. <Row (S3, S2, S1)). This is commonly called the sparse bottom-up method.
    5. If you do not want to use the <Column command for the dense dimensions, then the dense dimensions should be placed at the end of the <Row command (e.g. <Row (S3, S2, S1, D1, D2)).
    6. Do not use the <Page command; use the <Column command instead.
    This will work, trust me.
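    Once the report script is in place, you can drive the export from MaxL rather than EAS. A hedged sketch with hypothetical names; the report script 'Lev0Col' is assumed to exist on the server:

    export database 'AsoApp'.'AsoDb' using server report_file 'Lev0Col' to data_file 'lev0_columns.txt';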

  • Clear the data region using Maxl

    Hi,
    I am trying to clear a data region using a MaxL query, and I am getting an error like: Dynamic members not allowed in data clear region specification.
    alter database 'PGSASO'.'ASOPGS' clear data in region 'CrossJoin(CrossJoin({[Actual]},{[FY11]}),{[Inputs].Levels( 0 ).Members})';
    (Inputs is not a dynamic member, but its parent is dynamic; it comes under the Account dimension.)
    I have also tried the code below (Syntax error in input MDX query on line 1 at token ',').
    alter database 'PGSASO'.'ASOPGS' clear data in region 'CrossJoin([Inputs].Levels(0), CrossJoin({[ACT]},{[FY11]}))';
    Can anyone let me know if there is a syntax error, or whether an alternate function is available for clearing level-0 members/descendants under a dynamic parent...
    Thanks,
    Bharathi

    There is no syntax error in your statement. The problem is that your Inputs dimension (I guess) has some members with formulas on them, and the clear statement does not allow those members in the clear region. There are a couple of ways around it: separate out the members with formulas under a different parent and then modify the CrossJoin to pick up only the level-zero members that are not in that parent, or put a UDA on the formula members and use the exclude command to exclude anything with that UDA.
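    A hedged sketch of the UDA approach against the names in this thread; "NoClear" is a hypothetical UDA assumed to be tagged on the formula members, and Except/Filter/IsUda are the MDX functions doing the excluding:

    alter database 'PGSASO'.'ASOPGS' clear data in region
      'CrossJoin(CrossJoin({[Actual]},{[FY11]}),
                 Except([Inputs].Levels(0).Members,
                        Filter([Inputs].Levels(0).Members,
                               IsUda([Inputs].CurrentMember, "NoClear"))))';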

  • Exporting data in ASO

    I have an ASO cube with 4 GB of data. I am exporting data through MaxL as Export.txt. As my database is large, Analytic Services creates two files:
    Export.txt and Export_01.txt.
    I am clearing all data from the cube and trying to load the data back from the exported files Export.txt and Export_01.txt using MaxL statements.
    The problem is that the first file loads without any error while the second file gives me an error. When I opened these files in the rules file editor, the first file has all the dimensions and data, while the second file has just two dimensions and data.
    Can anybody guide me on how to manage this? I have an ASO cube in which one dimension is frequently updated, so I have to take a backup of the data and reload it.
    Thanks,

    Hi,
    I have some question related to your problem.
    Are you using the same platform across the operation (export to import)? Kindly give details of your environment.
    What error are you facing?
    The rules file editor only shows a limited number of rows when opening a data file, so how can you be sure the second file has only two dimensions and data? Kindly open the file in a text editor and check it.
    Thanks
    Dhanjit G.
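    A minimal MaxL sketch of the reload, assuming both files came from a native export (in which case no rules file is needed) and sit in the database directory; the application and database names are hypothetical:

    import database 'AsoApp'.'AsoDb' data from server text data_file 'Export' on error abort;
    import database 'AsoApp'.'AsoDb' data from server text data_file 'Export_01' on error abort;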

  • Can we upload data in ASO Cube version 11.1.2

    Hi,
    I have a requirement where users need the ability to enter their plan data into the ASO application. Do we need to create a transparent partition for that? I have an ASO cube and a Hyperion Planning cube, and my Planning cube has more dimensions than the ASO cube.
    Please advise
    Thanks in advance.

    If you are using the Excel add-in you do NOT do a lock, just a send. The lock is a BSO concept to put a hold on blocks; ASO has no blocks to lock. From Smart View you do a submit and it handles it all for you. The beauty of using ASO is exactly what you mention: results are immediate without a calc.

  • ABAP Code for Backup the entire table data in the application server

    Hello Friends,
    I have to create a table data backup, store the entire table's data on the application server, and also be able to restore the data if needed.
    This should be a dynamic program for any table, based on the table name given, writing to the application server. I have developed a program for this, but it has problems with quantity and amount fields; it is not writing them correctly at the application server level.
    Any suggestions on this?
    Below is the program for this.
    Thanks,
    Ster.
    * Report  YWMM_TABLE_DUMP                                             *
    REPORT ywmm_table_dump .
    TABLES :
            dd03l.
    * Type spool declaration
    TYPE-POOLS:
            abap, slis.
    DATA: i_table_data1  TYPE REF TO data.
    DATA : it_dd03l LIKE dd03l OCCURS 0 WITH HEADER LINE.
    *DATA : gt_fieldcat TYPE lvc_s_fcat.
    DATA : i_fcat      TYPE STANDARD TABLE OF lvc_s_fcat,
           l_dr_line         TYPE   REF TO data,
           l_v_as4vers       TYPE as4vers.
    FIELD-SYMBOLS: <f_table_data1>     TYPE STANDARD TABLE,
                   <f_wa_table_data1>  TYPE ANY.
    SELECTION-SCREEN: BEGIN OF BLOCK bl1 WITH FRAME TITLE text-001.
    PARAMETERS: rb_copy RADIOBUTTON GROUP map DEFAULT 'X',
                rb_rest RADIOBUTTON GROUP map.
    SELECTION-SCREEN: END   OF BLOCK bl1.
    SELECTION-SCREEN: BEGIN OF BLOCK bl2 WITH FRAME TITLE text-002.
    PARAMETERS: p_table  TYPE tabname OBLIGATORY,
                p_plfld TYPE dd03l-fieldname.
    SELECTION-SCREEN SKIP 1.
    PARAMETERS: p_bkfile TYPE localfile OBLIGATORY.
    SELECTION-SCREEN: END   OF BLOCK bl2.
    PERFORM get_data.
    IF rb_copy = 'X'.
      PERFORM backup.
    ELSEIF rb_rest = 'X'.
      PERFORM database_update.
    ENDIF.
    *&      Form  get_data
    FORM get_data.
      CLEAR   i_fcat.
      REFRESH i_fcat.
      CALL FUNCTION 'LVC_FIELDCATALOG_MERGE'
           EXPORTING
                i_structure_name = p_table  " Table Name
           CHANGING
                ct_fieldcat      = i_fcat
           EXCEPTIONS
                OTHERS           = 1.
      CALL METHOD cl_alv_table_create=>create_dynamic_table
      EXPORTING
        it_fieldcatalog = i_fcat
      IMPORTING
        ep_table        = i_table_data1.
      IF sy-subrc = 0.
        ASSIGN i_table_data1->* TO <f_table_data1>.
      ELSE.
        WRITE: 'Error creating internal table'.
      ENDIF.
      IF rb_copy = 'X'.
        SELECT  * FROM (p_table) INTO CORRESPONDING FIELDS OF
                  TABLE <f_table_data1> UP TO 20 ROWS.
      ELSEIF rb_rest = 'X'.
        CREATE DATA l_dr_line LIKE LINE OF <f_table_data1>.
        ASSIGN l_dr_line->* TO <f_wa_table_data1>.
    *Get Data from Application Server
    * Opening the dataset P_BKFILE given in the selection screen
        TRANSLATE p_bkfile TO LOWER CASE.
        OPEN DATASET p_bkfile FOR INPUT IN TEXT MODE." ENCODING DEFAULT.
        IF sy-subrc NE 0.
    *    MESSAGE:
        ELSE.
          DO.
    * Reading the file from application server
            READ DATASET p_bkfile INTO <f_wa_table_data1>.
            IF sy-subrc = 0.
              APPEND <f_wa_table_data1> TO <f_table_data1>.
            ELSE.
              EXIT.
            ENDIF.
          ENDDO.
    * Closing the dataset
          CLOSE DATASET p_bkfile.
        ENDIF.
      ENDIF.
    ENDFORM.                    " get_data
    *&      Form  backup
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM backup.
      TRANSLATE p_bkfile TO LOWER CASE.
      OPEN DATASET p_bkfile FOR OUTPUT IN TEXT MODE.
      IF sy-subrc NE 0.
        WRITE: text-017.
        STOP.
      ELSE.
        LOOP AT <f_table_data1> ASSIGNING <f_wa_table_data1>.
          TRANSFER <f_wa_table_data1> TO p_bkfile.
        ENDLOOP.
      ENDIF.
      CLOSE DATASET p_bkfile.
    ENDFORM.                    " backup
    *&      Form  database_update
    FORM database_update.
      DATA : i_mara_u TYPE STANDARD TABLE OF mara WITH HEADER LINE,
             i_ekpo_u TYPE STANDARD TABLE OF ekpo WITH HEADER LINE,
             i_eban_u TYPE STANDARD TABLE OF eban WITH HEADER LINE,
             i_resb_u TYPE STANDARD TABLE OF resb WITH HEADER LINE,
             i_plpo_u TYPE STANDARD TABLE OF plpo WITH HEADER LINE,
             i_stpo_u TYPE STANDARD TABLE OF stpo WITH HEADER LINE,
             i_vbap_u TYPE STANDARD TABLE OF vbap WITH HEADER LINE,
             i_vbrp_u TYPE STANDARD TABLE OF vbrp WITH HEADER LINE,
             i_lips_u TYPE STANDARD TABLE OF lips WITH HEADER LINE,
             i_afvc_u TYPE STANDARD TABLE OF afvc WITH HEADER LINE,
             i_asmd_u TYPE STANDARD TABLE OF asmd WITH HEADER LINE,
    *           i_cooi_u TYPE STANDARD TABLE OF cooi WITH HEADER LINE,
             i_qmel_u TYPE STANDARD TABLE OF qmel WITH HEADER LINE,
             i_cooi_u TYPE STANDARD TABLE OF cooi WITH HEADER LINE,
             i_esll_u TYPE STANDARD TABLE OF esll WITH HEADER LINE,
             i_t165_u  TYPE STANDARD TABLE OF t165 WITH HEADER LINE,
             i_t165e_u TYPE STANDARD TABLE OF t165e WITH HEADER LINE,
             i_twpko_u TYPE STANDARD TABLE OF twpko WITH HEADER LINE,
             i_tpext_u TYPE STANDARD TABLE OF tpext WITH HEADER LINE,
             i_ce4mxpa_u TYPE STANDARD TABLE OF ce4mxpa WITH HEADER LINE,
             i_ce4mxpa_acct_u TYPE STANDARD TABLE OF ce4mxpa_acct WITH
                                                             HEADER LINE,
             i_zaim_u  TYPE STANDARD TABLE OF zaim WITH HEADER LINE,
             i_s012_d TYPE STANDARD TABLE OF s012 WITH HEADER LINE,
             i_s012_i TYPE STANDARD TABLE OF s012 WITH HEADER LINE,
             i_dummy  TYPE STANDARD TABLE OF mara.
      CASE p_table.
        WHEN 'MARA'.
    *     Non-Key
          PERFORM move_to_table USING   <f_table_data1>
                                CHANGING i_mara_u[]
                                         i_mara_u.
          PERFORM update_table USING i_mara_u[].
      ENDCASE.
    ENDFORM.                    " database_update
    *&      Form  move_to_mara
    FORM move_to_table USING    p_tab_from TYPE STANDARD TABLE
                       CHANGING p_tab_to   TYPE STANDARD TABLE
                                p_w_table.
      DATA:  l_wa_fcat TYPE lvc_s_fcat.
      FIELD-SYMBOLS: <f_field_from> TYPE ANY,
                     <f_field_to>   TYPE ANY.
      LOOP AT p_tab_from ASSIGNING <f_wa_table_data1>.
        LOOP AT i_fcat INTO l_wa_fcat.
          ASSIGN COMPONENT l_wa_fcat-fieldname
         OF STRUCTURE <f_wa_table_data1> TO <f_field_from>.
          ASSIGN COMPONENT l_wa_fcat-fieldname
         OF STRUCTURE p_w_table TO <f_field_to>.
          <f_field_to> = <f_field_from>.
        ENDLOOP.
        APPEND p_w_table TO p_tab_to.
      ENDLOOP.
    ENDFORM.                    " move_to_mara
    *&      Form  update_table
    FORM update_table  USING p_table_update TYPE STANDARD TABLE.
      SELECT SINGLE *
        FROM dd03l
       WHERE fieldname = p_plfld
         AND tabname   = p_table
         AND keyflag   <> 'X'
         AND as4local = 'A'
         AND   as4vers = l_v_as4vers
         AND   ( comptype = 'E' OR comptype = space ).
      IF sy-subrc = 0.
    *   Do update
        IF NOT p_table_update IS INITIAL.
          UPDATE (p_table) FROM TABLE p_table_update.
          IF sy-subrc = 0.
            COMMIT WORK.
          ELSE.
            ROLLBACK WORK.
            WRITE: text-003.
            STOP.
          ENDIF.
        ENDIF.
      ELSE.
    *delete and insert.
        IF NOT p_table_update IS INITIAL.
    *      DELETE (p_table).
          IF sy-subrc = 0.
            INSERT (p_table) FROM TABLE p_table_update.
            IF sy-subrc = 0.
              COMMIT WORK.
            ELSE.
              ROLLBACK WORK.
              WRITE: text-018.
              STOP.
            ENDIF.
          ELSE.
            ROLLBACK WORK.
            WRITE: text-018.
            STOP.
          ENDIF.
        ENDIF.
      ENDIF.
    ENDFORM.                    " update_table
    Edited by: Julius Bussche on Jul 18, 2008 1:43 PM
    Please use a meaningful subject title!

    ARS,
    I am struggling a bit to get this.
    There is a syntax error:
    Field "FIELDS_INT-TYPE" is unknown. It is neither in one of the specified tables nor defined by a "DATA" statement.
    Also, you have asked me to move the data to a different table. What is that table and how do I build it?
        LOOP AT <f_table_data1> ASSIGNING <f_wa_table_data1>.
          LOOP AT i_fcat INTO l_fcat.
            IF l_fcat-inttype EQ 'P'.
              ASSIGN COMPONENT l_fcat-fieldname
                  OF STRUCTURE <f_wa_table_data1> TO <f_field>
                  TYPE     fields_int-type
                  DECIMALS fields_int-decimals.
            ELSE.
              ASSIGN COMPONENT l_fcat-fieldname
                  OF STRUCTURE <f_wa_table_data1> TO <f_field>
                  TYPE     fields_int-type.
            ENDIF.
            " Move <f_field> to a new table and use this table for download
          ENDLOOP.
          TRANSFER <f_wa_table_data1> TO p_bkfile.
        ENDLOOP.
    Ster

  • Clearing the data buffer from the input fields

    Hi,
    I am using the user exit CONFPP02 for transaction CO11N. I have written a program such that confirmation numbers having the status CNF are not allowed to be processed; the whole confirmation is terminated when the system checks the confirmation number and finds its status is CNF. If the status of the confirmation number is PCNF, the program allows further processing of transaction CO11N.
    The problem starts when the user enters a PCNF-status confirmation number and presses Enter: the system stores the values in the various input fields. Now, without exiting the initial confirmation screen, if the user replaces the PCNF-status confirmation number with a CNF-status confirmation number, the system issues a warning message: "Confirmation no. or order/sequence/operation has been changed. However, the input fields still contain default values from the preceding confirmation." If the user says yes, the program logic is bypassed, thus allowing the reconfirmation of the CNF confirmation number into a PCNF confirmation number.
    Can anybody suggest a suitable method to clear the data stored by default in the input fields so that the program can be made to work?
    With regards,
    Avinash.S
    Mobile no:09996192456

    Hi Gilad,
    I never use Preview and did not use Preview at all before sending the document over to my most recent client either.
    I only opened Preview this morning after I discovered that all of the form data had disappeared on his end! And the only reason I opened Preview was because I already knew it looked fine in Acrobat, so I wanted to view the form in a different application.
    My client would not even have notified me about it had he not sent the signed form back to me, and I saw there was no data there! I then called him and asked him about the missing data, and he said that he thought I had simply sent over the form purposely with blank fields :(
    I have just emailed him to ask what application he opened it in. My understanding is that he is on Windows, because he mentioned to me that his company was using Windows when we last spoke. Perhaps he has a personal MacBook where he opened it? I just don't know and can hopefully soon find out :)
    But THANK YOU for letting me know about that script! I appreciate it:) I have now downloaded and installed it and will just have to use it on ALL of my forms before sending them out:)

  • Question regarding ASO application in Essbase 11 version

    Hi All,
    Thanks for the replies to my previous posts.
    I have a question regarding an ASO application for a telecom company, built in Essbase version 11. Please provide your feedback on the design.
    The ASO application has the following dimensions:
    Dimension       Levels   Level-0 Members   Attribute Dimensions
    Dimension1         2     6.5 million       15
    Dimension2         1     3                 -
    Dimension3         1     4                 -
    Dimension4         1     6                 -
    Dimension5         1     6                 -
    Dimension6         1     5                 -
    Dimension7         1     3                 -
    Dimension8         5     1700              -
    Dimension9         2     800               -
    Dimension10        2     40000             -
    Dimension11        3     750               -
    Dimension12        2     34000             -
    Dimension13        1     15                -
    The number of Measures is 8.
    The outline size is around 2.12 GB.
    The data is mostly sparse. Does this design yield good performance? Should I change some of the attributes to UDAs to increase performance? I think attribute dimensions are more flexible than UDAs, but they affect retrieval performance.
    Thanks in advance.
    Kannan.

    In ASO, attribute dimensions are treated like regular dimensions; that is to say, they are materialized just like a regular dimension. Changing them to UDAs won't buy you the same performance that having them as dimensions does. The one nice thing about attributes in ASO cubes (and BSO cubes) is that you don't clutter up the screen with dimensions that are not used a lot. If your attribute dimensions are used often in your ASO cube, there would be no performance difference if you made them regular dimensions.

  • Clearing partially a document using POSTING_INTERFACE_CLEARING

    Hello.
    I want to partially clear a financial document using POSTING_INTERFACE_CLEARING (transaction FB05), but when I try to do it, I get the error "No data for SAPDF05X dynpro 3100".
    I can clear it completely.
    Is it possible to partially clear a financial document?
    Thanks very much.

    Hi Friends,
    I had the same problem (partial clearing) and solved it with the same FM using a simple solution.
    When the values are not the same, you can post another document to the customer like this:
    * Batch input values
    lt_ftpost-stype = 'P'.        "Header
    lt_ftpost-count = 2.          "Number of dynpro
    lt_ftpost-fnam  = 'RF05A-NEWBS'.
    lt_ftpost-fval  = '40'.       "Same posting key type as documents cleared via F-32
    APPEND lt_ftpost.
    lt_ftpost-count = 2.          "Number of dynpro
    lt_ftpost-fnam  = 'RF05A-NEWKO'.
    lt_ftpost-fval  = gl_account. "G/L account
    APPEND lt_ftpost.
    lt_ftpost-stype = 'P'.        "Header
    lt_ftpost-count = 2.          "Number of dynpro
    lt_ftpost-fnam  = 'BSEG-WRBTR'.
    lt_ftpost-fval  = '600,00'.   "The partial amount
    APPEND lt_ftpost.
    * Batch input values
    lt_ftpost-stype = 'P'.        "Header
    lt_ftpost-count = 3.          "Number of dynpro
    lt_ftpost-fnam  = 'RF05A-NEWBS'.
    lt_ftpost-fval  = '01'.       "Customer posting key
    APPEND lt_ftpost.
    lt_ftpost-count = 3.          "Number of dynpro
    lt_ftpost-fnam  = 'BSEG-HKONT'.
    lt_ftpost-fval  = '9000125'.  "Customer account
    APPEND lt_ftpost.
    lt_ftpost-stype = 'P'.        "Header
    lt_ftpost-count = 3.          "Number of dynpro
    lt_ftpost-fnam  = 'BSEG-WRBTR'.
    lt_ftpost-fval  = '400,00'.   "Residual amount
    APPEND lt_ftpost.
    * Documents to be cleared
    lt_ftclear-agkoa  = 'D'.      "Account type
    lt_ftclear-xnops  = 'X'.      "Select only open items which are not special G/L
    "lt_ftclear-xfifo = 'X'.
    lt_ftclear-agbuk  = p_bukrs.  "Example company code
    lt_ftclear-agkon  = p_kunnr.  "Example customer
    lt_ftclear-selfd  = 'BELNR'.  "Selection field
    lt_ftclear-selvon = p_doc1.   "Document selected
    lt_ftclear-selbis = p_doc1.
    APPEND lt_ftclear.

  • Clearing the data on logout

    Hi all,
    Flex newbie here. I just want to know how to clear the data that I fetched from the server on logout. My application has multiple components and each component has different states. Once I log out and log back in, I am getting the previous state from before I logged out. How do I solve this problem? Please help me out.

    I'd start with the "loginResultHandler" function and use it
    to direct the user to a specific component within a ViewStack
    whenever they login or log back in.
    For example:
    [Bindable]
    private var currentUser:User = new User();
    private var currentFunction:Function;

    private function loginResultHandler(event:ResultEvent):void {
        currentUser = event.result as User;
        if (currentUser.loggedIn) {
            if (currentUser.roles == "admin") {
                Application.application.myViewStack.selectedIndex = 0;
            } else {
                Application.application.myViewStack.selectedIndex = 1;
            }
            loginForm.visible = false;
            currentFunction.call();
        } else {
            Alert.show("Login unsuccessful", "Server Authentication");
        }
    }
    This may or may not be the answer you are looking for but it
    should get you going in the right direction. Check out some of the
    great Flex tutorials at Lynda.com (
    http://movielibrary.lynda.com/html/modListing.asp?pid=205)
    Good luck!
