SSAS Source - Column selection lost when modifying an existing "Add Item" step

When developing a Power Query query against an SSAS source, I often need to edit an existing "Add Items" step in order to add a few columns that I initially missed. To do this, I point to the previously created step and click on the small gear icon to edit it.
However, when doing so, the edit dialog is not initialised with all the columns that I had previously selected, which means that if I am not careful and do not scrupulously re-check every column I need, the modification destroys what I had built before.
Inserting another step rather than modifying an existing one might be a workaround, but I would rather keep the number of steps to a minimum. Also, if there is an edit button, one should be able to rely on it.
Has anyone experienced the same issue?
PQ Version: 2.18.3874.242 - Excel 2013, 32-bit.

This is a known issue that will be resolved in an upcoming release.
A workaround for now, when using the UI builder to edit a selection, is to expand the dimensions that previously had an attribute added from them before dismissing the dialog.
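For reference, the step that the "Add Items" dialog writes is just a Cube.Transform call in M, so a selection the dialog has forgotten can also be repaired by editing the step in the Advanced Editor instead of re-ticking every column. Below is a minimal sketch of what such a step typically looks like; the server, database, cube, dimension and measure names are invented placeholders, and the exact cube-function names and argument order should be checked against your own generated code rather than taken from this sketch.
let
    // connect and navigate to the cube (all names are placeholders)
    Source = AnalysisServices.Databases("MyServer"),
    Database = Source{[Name = "MyDatabase"]}[Data],
    Model = Database{[Id = "MyCube"]}[Data],
    // the step the "Add Items" dialog generates: one list entry per selected column
    #"Added Items" = Cube.Transform(Model,
        {
            // expand a dimension attribute into a column
            {Cube.AddAndExpandDimensionProperty, "[Date]", "[Date].[Calendar Year]", "Date.Calendar Year"},
            // add a measure column
            {Cube.AddMeasureColumn, "Sales Amount", "[Measures].[Sales Amount]"}
        })
in
    #"Added Items"
Appending the missed columns as extra entries in that list keeps the query at a single "Added Items" step without having to trust the gear dialog to remember the earlier selection.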

Similar Messages

  • Why is "afr/column-select.cur" requested when moving the mouse?

    I ran an ADF page with an ADF table component, and I noticed a strange phenomenon:
    Every time I moved the mouse over the ADF table, it sent a request for "afr/column-select.cur".
    I guess it is the cursor definition file for the ADF table, but why are requests sent to the server for this file each time?
    Does it affect the performance?

    Hi,
    I think you should "Clean All" your project first and then rebuild. Make sure you have not applied any clientListener on the form.
    Hope it helps.
    Regards,
    Shah

  • "The server responded with an error" when modifying or deleting an item

    I can't modify or delete an item in my MobileMe synchronized calendar. I can move the item to a different date or time, but if I try to delete it or remove an attendee, I get the following error:
    The server responded with an error.
    The request for “***” in “Calendar” in account “***.***@me.com” failed.
    The server responded with
    “500”
    to operation CalDAVWriteEntityQueueableOperation.
    I can't delete the item, or remove attendees from me.com either.
    If I create another appointment, I can delete it, so I suspect it has something to do with the fact that I have invited attendees.
    Any suggestions?

    Are you behind a firewall at work or elsewhere?  Does your calendar sync with some other system like Google (it could be that the Google server is not responding)?  All we know from the message is that your Mac cannot reach the server.  Maybe try later.  Are you using Wi-Fi, or do you have an Ethernet connection to the router?

  • When I add items to the bookmarks toolbar (for example the Zoom Toolbar), they appear on the bookmarks toolbar, but with the favicons as well as the underlined text. I only want the favicons. How do I do this? Please help!

    I want to add items to the bookmarks toolbar, for example the Zoom Toolbar add-on. This works, but the bookmarks toolbar then shows not only the favicons but also the underlined text. How can I avoid this?
    I only want the favicons from an add-on to appear on the bookmarks toolbar, without the underlined text. How do I do this?
    This problem doesn't happen with any other toolbar, and it didn't happen with an earlier version of Firefox. Please help!
    Thanx, BassMann.

    If you only want the favicons and not the names of the bookmarks on the bookmarks toolbar, you can do that with the [https://addons.mozilla.org/en-US/firefox/addon/4072/ Smart Bookmarks Bar] add-on.

  • Content Cannot Be Displayed in a Frame When Trying to Add Item

    We have a user who is trying to add an item to a SharePoint calendar. She is using the IE9 browser in IE9 compatibility mode. When she selects a date and then clicks "Add Item", the dialog box pops up, but it displays "The content cannot be displayed in a frame". When I try adding an item, it works fine for me.
    I'm at a loss on this one. Has anyone experienced this or know what is going on?
    Thanks

    To resolve this issue, follow these steps:
    1. Click Start, click Control Panel, and then click Internet Options. (Note: make sure you view Control Panel using either small icons or large icons.)
    2. Click the Security tab, click Trusted sites, and then click Sites.
    3. Enter your SharePoint URL, click Add, then click Close.
    4. Make sure that the Enable Protected Mode check box isn't selected.
    5. Close all web browsers, and then try again with a new session.
    Ibrahim Sukari, Technical Consultant | SharePoint | Dynamics CRM

  • Different measures use the same source column but give different values in an SSAS project

    I've inherited an SSAS project. I'm currently gaining knowledge of it, but I do not understand this: there are 4 measures that get data from the same source table and column. Example measures:
    [Started]
    [Processing]
    [Processed]
    [Finished]
    All of them get data from a table (view), from a column "Event_count" that holds the value 0 in every row. The usage of the measures is "Count of non-empty values".
    Indeed, all of these measures give different values, but I don't understand how, since they all go against the same source column. I thought maybe they were defined as calculated measures, but there is nothing there. Where or what else could be affecting this behaviour?
    Bonus information (problem): one of the measures is returning blank/empty, which is why I started this research at all.
    Help anyone?
    regards .r

    Hi,
    I would ask whether all the measures are under the same measure group. It is important to know, since in the dimension usage we create relationships between dimensions and measure groups.
    Did you try deleting all the calculation script's code and leaving only the "CALCULATE;" part? Do the values change after deploying the slim script?
    Are all the properties the same? Defaults and caching?
    Regards, David.

  • Error when modifying a column several times -- CHANGED_GLOBAL_INSYNC

    Hello, I have a problem.
    I have sync running in the background, and I am modifying a column several times. The first modification works fine, and the second works fine too, but sometimes problems appear from the third time on. Why?
    The error in the console is:
    "Creation on Field cZSBOE002_TOP_STATUS with value N is not allowed because SyncBo of Row 0001014016 is CHANGED_GLOBAL_INSYNC and CreateInputQualifyType is FORBIDDEN"
    The syncKey is good.
    What could be the cause? How can I solve it?
    Thanks,

    Hi Victor,
    the error message is the answer to your question:
    0001014016 is CHANGED_GLOBAL_INSYNC
    Check the status of the BO - there you will see that it has three main states:
    Global
    Local
    Insync
    The problem in your case - and I always told you to be careful with auto sync:
    You change an item - if you change it again before the sync, or after a complete sync when the answer for that item has come back from the backend, then all is fine. Just wait one more sync cycle and change the item then - all is fine.
    Now the solution:
    OK, you create an item - the item is in LOCAL. As long as it is in LOCAL you can change it again and again, as often as you like.
    Now the sync starts. The item goes into the INSYNC state. This means any change is forbidden - this is to avoid any inconsistency between backend and client, because the backend wants to send its answer back to the client. Wait for that answer - usually it comes down to the client with the next sync.
    Then the item goes to GLOBAL - and now you can change it again, until the next sync puts the changed item back into state INSYNC - see above.
    Well, that is all - check the status of the BO and change it only while it is GLOBAL or LOCAL.
    Hope that helps.
    Regards,
    Oliver

  • Error when modifying tabular form

    Hi
    In Apex 3, I created a tabular form based on the query below:
    select user_id /* PK */, name, dob
    from users
    It works perfectly well; I am able to update the name and dob fields.
    Then I changed the query to:
    select user_id /* PK */, name, null dob
    from users
    ...(i.e. I am not interested in the dob column anymore but want to keep the report column) and cleared the Reference Table Owner/Name/Column Name properties of the dob report column.
    However, when I try to update the name of a user, I get the error:
    Error in mru internal routine: ORA-20001: Error in MRU: row= 1,
    ORA-20001: ORA-20001: Current version of data in database has changed since user initiated update process.
    current checksum = "7C3CD9C5DA0B5AF45AE616959020A0AA",
    item checksum = "6910D5C7015D84773A9E8F65299991F6".,
    update "SCHEMA"."USERS" set "USER_ID" = :b1, "NAME" = :b2, "DOB" = :b3Why is dob still being included in the update statement?
    Did I miss something?
    Thanks,
    Luis

    Since the DOB was one of the columns specified when you created the tabular form, the internal PL/SQL process that it uses for making updates is still trying to update that column. As you have now modified the underlying query for the form and are passing a NULL for your DOB, you will find that any update made using this form would set the DOB to NULL, so beware!
    Unfortunately, once you have created a tabular form, you cannot make changes to the process that it uses for manipulating data. Another drawback of the tabular form is that it uses the primary key of the table as a way of knowing which record it is updating when it processes the changes. As a result, you cannot modify the primary key value itself - and that is why you are getting the error.
    Generally speaking it is not a good idea to edit primary key values. If this piece of information does require editing, I would suggest you find another candidate for the primary key - perhaps a standard sequence-populated column that could act as a surrogate key? (A rough sketch follows at the end of this reply.)
    Cheers,
    Colin
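    As a loose illustration of that last suggestion (all object names here are invented, the existing primary key constraint on USER_ID would have to be looked up and dropped first, and this is only a sketch, not a tested migration):
    -- give USERS a sequence-populated surrogate key so the business value
    -- (USER_ID) no longer has to double as the tabular form's primary key
    ALTER TABLE users ADD (user_key NUMBER);
    CREATE SEQUENCE users_key_seq;
    UPDATE users SET user_key = users_key_seq.NEXTVAL;
    ALTER TABLE users MODIFY (user_key NOT NULL);
    -- after dropping the existing primary key constraint on USER_ID:
    ALTER TABLE users ADD CONSTRAINT users_pk PRIMARY KEY (user_key);
    The tabular form would then be rebuilt on USER_KEY as its primary key column, leaving USER_ID as an ordinary, editable column.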

  • How to create a dynamic RTF report which creates dynamic columns based on dynamic column selection from a table?

    Hi All,
    Suppose I have a table whose structure changes frequently, on a daily basis.
    For example, desc my_table gives the following column names on Day 1:
    SQL > desc my_table;
    Output
    Name
    Age
    Phone
    On Day 2, two more columns are added, viz. Address and Salary.
    SQL > desc my_table;
    Output
    Name
    Age
    Phone
    Address
    Salary
    Now I want to create a dynamic RTF report which would fetch data from ALL columns of my_table on a daily basis. For that I have defined a concurrent program with XML as the output type and attached a data template/data definition to it, which takes XML as input and gives the final output of the concurrent program in an Excel layout. I am able to do this for a constant number of columns, but I don't know how to do it when the number of columns to be displayed changes dynamically.
    For Day 1, my XML file should look like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <dataTemplate name="XYZ" description="iExpenses Report" Version="1.0">
    <dataQuery>
    <sqlStatement name="Q2">
    <![CDATA[
    SELECT Name
    ,Age
    ,Phone
    FROM my_table
    ]]>
    </sqlStatement>
    </dataQuery>
    <dataStructure>
    <group name="G_my_table" source="Q2">
      <element name="Name" value="Name" />
      <element name="Age" value="Age" />
      <element name="Phone" value="Phone" />
    </group>
    </dataStructure>
    </dataTemplate>
    And my Day 1 Excel output from the RTF template should look like this:
    Name     Age     Phone
    Swapnill     23     12345
    For Day 2, my XML file should look like this, with 2 new columns selected in the SELECT clause:
    <?xml version="1.0" encoding="UTF-8"?>
    <dataTemplate name="XYZ" description="iExpenses Report" Version="1.0">
    <dataQuery>
    <sqlStatement name="Q2">
    <![CDATA[
    SELECT Name
    ,Age
    ,Phone
    ,Address
    ,Salary
    FROM my_table
    ]]>
    </sqlStatement>
    </dataQuery>
    <dataStructure>
    <group name="G_my_table" source="Q2">
      <element name="Name" value="Name" />
      <element name="Age" value="Age" />
      <element name="Phone" value="Phone" />
      <element name="Address" value="Address" />
      <element name="Salary" value="Salary" />
    </group>
    </dataStructure>
    </dataTemplate>
    And my Day 2 Excel output from the RTF template should look like this:
    Name     Age     Phone     Address     Salary
    Swapnill     23     12345         Madrid     100000
    Now, I don't know how to do the following things:
    Make the XML dynamic, so that on Day 1 there are 3 columns in the SELECT statement and on Day 2, 5 columns. I want to create one dynamic XML that does not need to be changed when new columns are added to my_table. I don't know how to create this query, nor how to create the corresponding elements below it.
    Make the RTF template dynamic, so that on Day 1 there are 3 columns in the Excel output and on Day 2, 5 columns. I want to create a dynamic RTF template which would show all the columns selected in the dynamic XML. I don't know how the RTF will create new XML tags, nor how it will know where to place them in the report. That is, I can create the RTF template on Day 1 by loading XML data for 3 columns and placing 3 XML tags in the template, but how will it create and place tags for the new columns on Day 2?
    Hope you get my requirement; it's a challenging one. Please let me know how I can implement the required solution with RTF dynamically, without any manual intervention.
    Regards,
    Swapnil K.
    Message was edited by: SwapnilK

    Hi All,
    I am able to fulfil the above requirement. Now I am stuck at the point below. Need your help!
    Is there any way to UPDATE the XML file attached to a Data Definition (XML Publisher > Data Definition) using a standard package or procedure call, or maybe an API from the backend? I am creating an XML file dynamically and I want to attach it to its Data Definition programmatically using SQL.
    Please let me know if there is any Oracle functionality to do this.
    If not, please let me know the standard directories on the application/database server where the XML files attached to Data Definitions are stored.
    For example, /$APPL_TOP/ar/1.0/sql or something similar.
    Regards,
    Swapnil K.

  • Get OLD&NEW value of an UPDATED column selected dynamically in a Trigger

    Hi All,
    I am writing a trigger which takes the column name dynamically. On the basis of that column, it should give me the old value as well as the updated value of the column for the modified row.
    OOO_SCHEDULE is my table name.
    Note: this is only a test, so I am writing it only for update, not for insert and delete.
    create or replace trigger "OOO_SCHEDULE_AUDIT"
    BEFORE
    insert or update or delete on "OOO_SCHEDULE"
    for each row
    begin
    DECLARE
    v_username varchar2(30);
    AUDIT_EMP_ID varchar2(30);
    AUDIT_EMP_ID_NEW varchar2(30);
    v_Column_name VARCHAR(30);
    v_stmt1 VARCHAR(40);
    v_stmt2 VARCHAR(40);
    CURSOR C1 is
    select COLUMN_NAME from user_tab_columns where table_name='OOO_SCHEDULE';
    BEGIN
    OPEN c1;
    LOOP
    FETCH c1 into v_Column_name;
    EXIT WHEN c1%NOTFOUND;
    v_stmt1:=('OLD.'||v_Column_name);
    v_stmt2:=('NEW.'||v_Column_name);
    AUDIT_EMP_ID:=v_stmt1;
    AUDIT_EMP_ID_NEW:=v_stmt2;
    INSERT INTO TEMPTEST VALUES(v_stmt1);
    INSERT INTO TEMPTEST VALUES(AUDIT_EMP_ID);
    END LOOP;
    CLOSE c1;
    END;
    end;
    Suppose OOO_EMP_NAME is the column where the user made the change.
    If I do it like this:
    AUDIT_EMP_ID:=OLD.OOO_EMP_NAME;
    AUDIT_EMP_ID_NEW:=NEW.OOO_EMP_NAME;
    then it works fine, because I have given the column name statically. But I want the column name to be selected dynamically, and I am able to do that through the cursor. I am also able to fetch all the column names into the v_Column_name variable one by one, dynamically, from the cursor.
    But by executing these statements:
    AUDIT_EMP_ID:=v_stmt1;
    AUDIT_EMP_ID_NEW:=v_stmt2;
    I am getting the strings OLD.OOO_EMP_NAME and NEW.OOO_EMP_NAME rather than the old and new values of the updated column.
    Please help me identify the problem - where am I making the mistake? What is the correct way to execute these statements so that I can get the old and new values of the updated column?
    I have also tried passing this into a procedure, but I don't know how to execute the dynamic statement there to get the old and new values.
    Thanks,
    Ishrat.

    In the given link, the column name has been selected statically. But I want the column name to be selected dynamically through a loop, and then to check the condition for any update corresponding to that column's value. I don't want to write as many IF conditions as there are columns; I just want one IF condition for all column names.
    Don't be lazy. Write all the column names into your trigger. Or use a way to create the trigger "dynamically" - see the sketch below.
    What is the problem that you have with static column names? "I don't want to write many..." is not a problem, but an opinion.
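    As a loose illustration of the "create the trigger dynamically" idea (a sketch only: it assumes TEMPTEST has a single VARCHAR2 column, NULL and datatype handling are deliberately simplified, and it merely records which column changed), the trigger source can be generated once from USER_TAB_COLUMNS and recreated whenever the table definition changes, so that every :OLD/:NEW reference inside the trigger is static:
    DECLARE
      l_src VARCHAR2(32767);
    BEGIN
      l_src := 'CREATE OR REPLACE TRIGGER ooo_schedule_audit' || CHR(10)
            || 'BEFORE UPDATE ON ooo_schedule FOR EACH ROW' || CHR(10)
            || 'BEGIN' || CHR(10);
      FOR c IN (SELECT column_name
                  FROM user_tab_columns
                 WHERE table_name = 'OOO_SCHEDULE'
                 ORDER BY column_id) LOOP
        -- one static comparison per column
        l_src := l_src
              || '  IF NVL(TO_CHAR(:OLD.' || c.column_name || '), ''#'') <> '
              ||      'NVL(TO_CHAR(:NEW.' || c.column_name || '), ''#'') THEN' || CHR(10)
              || '    INSERT INTO temptest VALUES (''' || c.column_name || ''');' || CHR(10)
              || '  END IF;' || CHR(10);
      END LOOP;
      l_src := l_src || 'END;';
      EXECUTE IMMEDIATE l_src;  -- (re)creates the trigger with static column references
    END;
    /
    Rerun the block after any change to the table, and the trigger stays in step with it without any dynamic :OLD/:NEW lookup at runtime.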

  • CD Does Not Appear in Source Column

    When using iTunes 4.7.1, inserting an audio CD would automatically open iTunes, and the CD would appear in the Source column.
    I just upgraded to iTunes 7.0.2, and I've double-checked the CD & DVD System Preferences, and went to iTunes/Preferences/Advanced/Importing and selected On CD Insert: Show CD.
    I've tried several different commercial CDs and not one shows up or can be played.
    I've checked other posts, and there seem to be others with this exact same problem, but I haven't seen any advice that works.
    Does anybody have a better solution than a downgrade to 4.7.1?

    I realize the reasonableness of your suggestion, but creating a new user means I won't even recognize my computer. Even if this restores functionality to iTunes 7.0.2, then what will I do?
    I MAY wind up with a functioning iTunes, but I may not, and even if it does work, nothing else will. I'll be spending days getting my computer back, and then iTunes 7.0.2 will all of a sudden stop working, just as it has for many others.
    I've already downgraded to 4.7.1, which I fortunately saved as a .zip. It doesn't look as nice, but at least it plays CDs and has all the other features I need. Other than looks, I haven't figured out what the latest version will do for me.

  • Array data sporadically lost when calling stored procedure

    We have a weird problem with stored procedures for which I have not been able to find an explanation so far.
    Our app has a bunch of stored procedures which process arrays of integers or strings.
    From time to time (maybe 1 in 1000), array data gets lost when calling a stored procedure. From the logs we know that the data gets sent to the stored procedure, but on the DB side the procedure throws an exception because the arrays are empty.
    We are using a (modified) OracleConnectionCacheImp (thus PooledConnections, etc.). My latest guess is that the ArrayDescriptors on the (pooled) connection used to call the stored procedure are somehow affected when another pooled connection from OracleConnectionCacheImp gets removed by a parallel thread. Is that possible? Or are the PooledConnections internally (in the pooled connection data source) completely isolated from each other?

  • About Query Data Source Columns property

    Hello everyone,
    I'm new to Oracle Forms version 10.1.0.2.
    When I create a data block based on a table using Data Block Wizard, the block's Query Data Source Columns property is automatically populated with column definition entries corresponding to the columns of the base table.
    I tried making changes to these entries, for example by changing the data types to wrong data types or even deleting them, and I found that those changes had no effect on the block at all. The form was still working as I wanted.
    Please explain exactly what the role of the block's Query Data Source Columns property is.
    Thank you very much.
    p.s: The F1 key help says "The Query Data Source Columns property is valid only when the Query Data Source Type property is set to Table, Sub-query, or Procedure". So, please explain in each context of Query Data Source Type.

    p.s: The F1 key help says "The Query Data Source Columns property is valid only when the Query Data Source Type property is set to Table, Sub-query, or Procedure". So, please explain in each context of Query Data Source Type.
    IMHO those properties are fairly self-explanatory: they describe the data source of the block or, in other terms, how it is populated.
    Table means the data block is based on a table and will subsequently be populated by
    select col1, col2, col3 from your_table
    With Sub-query the block will be populated by your subquery; Forms will issue
    select col1, col2, col3 from (
      -- this is your subquery
      select col1, col2, col3 from tab1, tab2 where [....]
    )
    With Procedure, in short, you'd have a stored procedure which returns a ref cursor, and the block will be populated from that ref cursor (see the sketch at the end of this reply).
    As for your question about the name: this actually should matter. The default is NULL, which means there needs to be a column with the exact same name as the item, so in the table sample above the item associated with your_table.col1 should be named col1. If it isn't, the property should be set to the column name. If this property also doesn't reflect the name, it shouldn't work, IMO.
    cheers
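    As a bare-bones illustration of the Procedure variant (the package, procedure and table names here are invented, and the block's items still have to match the cursor's columns), the kind of stored procedure a Procedure-based block points at could look like this:
    -- minimal ref-cursor package a Procedure-based block could use
    CREATE OR REPLACE PACKAGE emp_block_pkg AS
      TYPE emp_rc IS REF CURSOR RETURN emp%ROWTYPE;
      PROCEDURE query_emps (p_data IN OUT emp_rc);
    END emp_block_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY emp_block_pkg AS
      PROCEDURE query_emps (p_data IN OUT emp_rc) IS
      BEGIN
        -- Forms fetches the block's rows from this cursor
        OPEN p_data FOR SELECT * FROM emp;
      END query_emps;
    END emp_block_pkg;
    /
    The block would then name this procedure as its query data source, and the Query Data Source Columns entries would describe the columns the ref cursor returns.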

  • Continuous, source/destination selectable, time-stamped backup: WonderSave?

    I've spent a significant amount of the last six years (5000+ hours) in front of my Mac putting together a book. I'm very careful about backups: documents are saved several times an hour while working on them, then backed up twice daily to five hard drives in rotation, then weekly to two sets of DVDs. Never lost an ounce of work -- until yesterday. Something happened to several of the documents I was working on before I did a backup. They all had a size of 0 bytes, and when I did backup...
    In two hours of redoing the work I had the documents back, but was I annoyed at the waste of time. So I need a piece of software which I am naming WonderSave. This is how it works:
    *Step 1*: I fire up WonderSave and it allows me to select any area (the Source) on any of my hard drives that I want WonderSave to keep an eye on. This selection process is similar to how I can select multiple areas when using OSX to Find.
    Step 2: I select where I want the backups to appear (the Destination).
    Step 3: I tell WonderSave how long to keep the files before deleting them.
    In operation, every time I save into my Source, WonderSave simultaneously saves a time-stamped copy in my Destination. For example, if during the day I work on 15 files in four different programs and save a total of 67 times, I will find I have 67 files in my Destination.
    Gladden my heart, someone, by telling me the real name of this wonderful piece of software.

    No, not the ones I mentioned. You could upgrade to Leopard and use Time Machine, which does back up every hour. The ones I mentioned can be scheduled to back up as frequently as you want, but this imposes some overhead for constantly checking the drive for changed files. Not overly efficient.
    There are two backup programs that do that, however, called TimeDrawer and NTI Shadow. They do not create bootable clones, but they do perform versioned backups as you describe. You will find them at VersionTracker or MacUpdate.
    There is no backup solution that does everything. You may need to implement more than one to accomplish both automatic backups when a document is saved and bootable clones. For the latter there are some good alternatives:
    LaCie's new SilverKeeper 2.0 (free)
    SuperDuper! (commercial)
    Carbon Copy Cloner (donationware; free to academics)

  • Cluster value lost when leaving case structure

    Hello all,
    I am having an issue with a particular VI I am working on. (FYI, this is in LabVIEW 8.0.)
    The VI is set up to run a current source, voltmeter, and thermometer and then record and graph data in various ways that are selectable by the user.
    Everything seems to be working fine except one particular graphing method. When I try to graph Current vs. Voltage, the current value is getting lost. Within a case structure the current and voltage are combined into a cluster, which then exits the case structure to be later appended to a cluster array.
    However, the value of the cluster is being lost when the graph is set for Current vs. Voltage. With any other method it works properly.
    I can't seem to figure out why this is happening, as it does not make much sense that the other graphing methods work but this one doesn't when it is coded exactly the same apart from different variables.
    I have attached the VI with added indicators showing that the cluster value is being lost once it exits the case structure. Any help would be much appreciated.
    Thank you very much and I apologize for my extremely messy VI.
    - Nathan Cernetic
    Attachments:
    IV-RT.vi (185 KB)

    You really need to boil this down to something we can run and reproduce the issue.
    Have you tried to break a few relevant connections and rewire? Maybe something is corrupt.
    Besides being a mess, your VI has quite a few glaring mistakes.
    Let's have a look at the structures in the upper right:
    90% of the code is the same in all three case structures, so all that needs to be inside the case structure is the small part where you build the 2D array. Everything else belongs outside the case.
    The sequence structure has no useful function.
    Why do you need to reverse the array with each iteration of the small FOR loop? Once, before the loop, would be enough. Right?
    Why do you use Build Array inside the small FOR loop, but don't autoindex at the loop boundary? All you ever get is an array with exactly one element, no matter how many times the loop runs. Seems useless! (See the image below for an alternative.)
    You would not even need to reverse if you used Build Array instead of "insert into array at position 0" in the central part.
    Others: you constantly hammer all your property nodes. They only need to be called when things change. Again, you have way too many instances. For example, in the case structure where you are having problems, the same property nodes exist in all cases. A single instance of the property node belongs outside the case, with only the string constants inside each case. Whenever code is the same in all cases of a case structure, that code belongs outside!
    Message Edited by altenbach on 06-19-2009 03:39 PM
    LabVIEW Champion. Do more with less code and in less time.
    Attachments:
    RubeLoopII.png (10 KB)
