Data manager-limited authorization

Hello experts,
I would like to define a user with limited access to the Data Manager. The user should be able to view all records but only upload images (JPG, TIF, GIF) to the Images table.
I created a new role and set all access rights to NONE except for the following:
Tables - Image Variant - Read/Write
Function Tab - Images - Execute
When I assign this role to a user, they can access the Data Manager and the Images table, but when they select Add Image they get the following error:
"Importing (file name) Failed...
Insufficient rights for operation"
What rights do I need to assign to a user so that they can only upload images and not create other records?
Thank you all in advance
Limor

Hi Limor,
Try doing the following:
Functions tab: assign Execute to all the record-related functions, such as Add, Delete, Modify, etc.
Tables & Fields tab: assign Read/Write access only to Images and Datagroups.
I hope this solves your problem.
Thanks & Regards
Dilmit Chadha

Similar Messages

  • Data Manager Limitation.

    Hello Experts,
    I wonder if there is any limitation in Data Manager. I am asking because I am trying to generate a report using Data Manager, but after the package has been running for about 30 minutes it fails with the following error:
    Task name TRANSACTION DATA SOURCE:
    MDX statement error: An error occurred when getting data from the processor.
    model: PRESUPUESTO. Package status: ERROR
    or the other one says
    (Content was truncated)
    Any idea if Data Manager has any limitations?
    Regards.

    That is the problem now. I can't run the package because the previous selection (which was a huge selection) got saved by itself.
    It doesn't have the <all> selection, but the previous selection I made. When I select Run, the package gets stuck.
    How do I reset the selection to the default?

  • Final users cannot see the data due to limited authorization.

    We have created an InfoSet with three InfoObjects: 0Account, 0Costcenter and 0COMP_CODE. 0Costcenter has an attribute, retail location 0RT_LOCATIO.
    0RT_LOCATIO is an authorization-relevant object. We as consultants can execute the InfoSet properly, but final users with limited authorizations cannot see the data because of an authorization failure.
    We have several options to solve the issue: deselect the authorization flag in the InfoObject, delete the InfoObject from the attributes of the cost center, or create an authorization object and assign it to the final users' profiles. But we don't want to go that way.
    My question is: is there any way to avoid including this attribute in the InfoSet definition? We are not using it in the query and we don't need it, so if we could delete it from the InfoSet (in the same way you add or delete InfoObjects from an InfoCube) without changing the cost center master data, our problem would be solved.
    Does anyone know how to do this (if it is possible)?
    Thanks in advance!

    Just do two things to find the authorization check that failed for that user.
    1. Check the SU53 output and find out which authorization check failed. If one did, please send it to the Basis team.
    2. Next, switch on the authorization trace in ST01 and ask that user to view the data. If the user fails with an authorization issue, switch off the trace in ST01 and find the issue in the trace.
    Try it this way; if it is not successful you can go for an alternative approach.
    Hope this helps you.

  • Data creation in data manager vs. ERP transaction

    Hi experts,
    I have a general question on working with MDM: when I use MDM as the central data management tool, I create and change all data within MDM Data Manager, or maybe in MDM iViews in EP. Doing it that way, I lose a lot of the positive aspects I previously had when doing this in ERP, even when using business content.
    For example:
    - no process logic exists (which I had in ERP dynpros in the code): no preconfigured assignments, validations, relationships, etc.
    - I have to know all relevant tables where I have to do my entries
    - no standard searches for typical problems
    - all roles and authorizations have to be created on my own (except standard roles like Admin, Data Expert, etc.); there is no possibility to transfer authorizations from ERP to MDM (authorizations can be very complex)
    - you need more (expert) know-how to maintain data (which is also a job of business departments)
    - etc.
    So a change to central master data management means a step backwards from a dynpro-controlled input GUI to a database front end with only rudimentary features, losing a lot of functionality that was built up over many years.
    Is this right, or did I miss something important? I think losing all this functionality can be a cause of resistance against the introduction of central master data management with MDM in companies.
    Thanks for your answers. Helpful answers will be rewarded.
    BR, bd

    Hi BD,
    CMDM is one of the later stages in using MDM effectively. Before moving to a CMDM scenario, it is necessary that the groundwork is set up.
    - One goes for a CMDM scenario once all the one-time (existing) data has already been cleansed. So first you need to extract all the existing data with possible high-risk duplicates from the ERP system into MDM, run the matching and merging strategies on it, and then maintain the consistent data in MDM. Only once this step is completed does it make sense to go for the creation of new data centrally through MDM.
    - Just like in ECC, when a user creates a new material, for instance, he has to go through all the views and enter values in all the required and optional fields. The same scenario can be replicated in MDM by creating a repository with all the fields that one needs to enter while creating a new material, and by using MDM validations we can simulate the same mandatory-field requirements as in ECC.
    - If you go for a business content repository for Material, Vendor, Product, etc., you will have all the ready-to-use roles with authorizations, so it is not much rework.
    - Standard searches may not be available in MDM, but one can always design his/her own search based on the customer requirement, and MDM searches are very easy to use and also very dynamic.
    I do agree that SAP MDM may not be a fully grown tool at this point in time, but with the SAP MDM 7.1 version most of these drawbacks will be addressed. Besides, although ECC can do most of the master data related work, it is not a dedicated system for that purpose, so it will have an impact on the performance and time involved.
    Hope It Helped,
    Kindly Reward Points if found useful
    Thanks & Regards
    Simona Pinto

  • Data management software

    Hi ladies 'n gents,
    I'm looking for data management software that supports all types of text, photos, music and video, with free tagging.
    Something like ACDSee for photos, but for more universal content. Any ideas? Help, please.

    Jpgs are raster images, and that's what will happen to the text. If you're limited to the file formats described, you could test exporting to eps and converting to wmf or emf
    (search online for "eps to wmf" or "eps to emf" converter)
    To get the text to look good it needs to stay a vector, so those are your only file options.
    Fringe Facts clearly isn't meant for "high end" printing, so I doubt there's a perfect solution. I'll bet you'll have to keep the design fairly basic (no drop shadows or transparencies) to get the files to work. I've never tried to create a wmf or emf file on purpose, so who knows what the final file will look like.

  • Search for [Remote Key] and [Remote System] in Data Manager

    Hello all
    I would like to be able to search on the remote key and the remote system in the MDM Data Manager; is that not possible? I thought I remembered seeing that option under the Free-Form Search, but now I can't find it.
    I have, however, found this in the Data Manager reference guide:
    REMOTE SYSTEM AND REMOTE KEY FIELDS
    MDM uses the remote systems defined in the Remote Systems table
    within the MDM Console to store and maintain key mapping information
    for each record or text attribute. It does this using a virtual “key
    mapping” field that you never see in the MDM Client.
    This virtual key mapping field is very much like a qualified lookup field
    into a virtual key mapping qualified lookup table.
    Key Mapping information stored in virtual lookup field
    The Remote System and Remote Key fields are normally not visible;
    however, they do appear in several places in the MDM Client.
    Specifically, both fields: (1) appear in the File > Export dialogs in Record
    mode for exporting value pairs; (2) are recognized by the File > Import
    dialog in Record mode for importing value pairs; and (3) appear in the
    Edit Key Mappings dialogs in both Record mode and Taxonomy mode,
    for viewing and editing value pairs.
    Is there any way to search on the value in the remote key from the Data Manager?

    I'm not sure, but I think searching on it is not possible.
    But you can see the keys as mentioned:
    Enable key mapping in the Console.
    MDM Client means MDM Data Manager.
    They do appear in several places in the MDM Client, i.e. the Data Manager. Three different ways to see them in Data Manager are already given below:
    Specifically, both fields: (1) appear in the File > Export dialogs in Record mode for exporting value pairs; (2) are recognized by the File > Import dialog in Record mode for importing value pairs; and (3) appear in the Edit Key Mappings dialogs in both Record mode and Taxonomy mode, for viewing and editing value pairs.
    BR,
    Alok

  • Error while opening MDM Data Manager

    Hi,
    We are getting the following error while trying to open Data manager on our repository.
    "Error Initializing Attributes for table Taxonomy
    Application will exit
    Already exists"
    mds Version 5.5.42.90
    MDM DataManager Ver5.5.42.90
    The repository loads without any errors, and the server log file doesn't show any errors either.
    Is there any way to fix this?

    Go to your MDM Console, log in to the repository and unload the repository first.
    Then load it with Update Indices.
    This problem occurs when you change anything in the schema in the Console and then load the repository with the Immediate option.
    So always prefer to use Update Indices, as the accelerator files will get updated and the data comes from the actual database.
    If the problem still persists, go to the Console again and check your repository for any fatal errors. If errors appear, repair the repository; even if none appear, repair it anyway.
    Then load with Update Indices.
    Hope this helps you.
    BR,
    Alok
    Edited by: Alok Sharma on Feb 14, 2008 9:20 AM

  • Not able to see data in the qualified table of the main table in Data Manager

    Hi,
    I have an issue of not being able to see the data of two qualified tables after populating them.
    This is in MDM 5.5 SP4.
    When populating the data the first time, it shows up in those two table slots on the right side of the Data Manager.
    However, subsequently it does not show up in those slots; only by right-clicking on the table and selecting "View/Edit" does a window pop up where the data shows up.
    Unlike the other qualified tables, the data does not show up automatically for these two tables.
    Appreciate any suggestion or feedback on this.
    regards,
    -reo

    You may have checked the Filter checkbox next to the qualified lookup cell in Data Manager when the current table is the main table.
    You use the Filter checkbox to limit the qualified table records by the current search selections.
    Secondly, you have to see whether there are any qualified links to the main table record you are viewing.
    If not, create the qualified links in Data Manager between the main table record and the qualified table record.
    Once this is done, you will see the display fields of the qualified table for which the links exist for the given main table record.
    Message was edited by:
            Adhappan Thiagarajan

  • Unable to access the data from Data Management Gateway: Query timeout expired

    Hi,
    For the last 2-3 days the data refresh has been failing on our Power BI site. I checked the following:
    1. The gateway is in running status.
    2. Data source is also in ready status and test connection worked fine too.
    3. Below is the error in System Health -
    Failed to refresh the data source. An internal service error has occurred. Retry the operation at a later time. If the problem persists, contact Microsoft support for further assistance.        
    Error code: 4025
    4. Below is the error in Event Viewer.
    Unable to access the data from Data Management Gateway: Query timeout expired. Please check 1) whether the data source is available 2) whether the gateway on-premises service is running using Windows Event Logs.
    5. This is the correlation ID for the latest refresh failure: f9030dd8-af4c-4225-8674-50ce85a770d0
    6. The Refresh History error is:
    Errors in the high-level relational engine. The following exception occurred while the managed IDataReader interface was being used: The operation has timed out. Errors in the high-level relational engine. The following exception occurred while the
    managed IDataReader interface was being used: Query timeout expired. 
    Any idea what could have gone wrong suddenly? Everything was working fine for the last month.
    Thanks,
    Richa

    Never mind, I figured out there was a lock on a SQL table which caused all the problems. Once I released the lock, the PowerPivot refresh started working fine.
    Thanks.
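    For anyone hitting the same symptom: one quick way to confirm that a blocking lock on the source database is the culprit is to list the blocked sessions. Below is a minimal sketch, assuming the source is SQL Server as in this thread; the server and database names are placeholders, and the inline T-SQL can just as well be run directly from Management Studio.
        // Diagnostic sketch (Power Query M): list sessions that are currently blocked
        // and which session is blocking them. "MyServer" is a placeholder server name;
        // the native query only reads the sys.dm_exec_requests DMV.
        let
            Blocked = Sql.Database(
                "MyServer",
                "master",
                [Query = "SELECT session_id, blocking_session_id, wait_type, wait_time FROM sys.dm_exec_requests WHERE blocking_session_id <> 0"]
            )
        in
            Blocked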

  • Unable to refresh SQL Server data source through Data Management Gateway

    I just installed version 1.1.5226.8 of the Data Management Gateway and tried to refresh a simple query on a table connected to SQL Server, with no transformations in Power Query.
    This is the error I obtain:
    Errors in the high-level relational engine. The following exception occurred while the managed IDataReader interface was being used: transfer service job status is invalid.
    I am wondering whether my Power BI is still not updated to handle such a connection type, or whether something else is not working.
    I correctly created the data source in the admin panel following the instructions in the Release Notes, and the Power Query test connection is OK.
    Marco Russo http://www.sqlbi.com http://www.powerpivotworkshop.com http://sqlblog.com/blogs/marco_russo
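    For readers less familiar with Power Query, the kind of query described above, a single SQL Server table with no transformations, looks roughly like this in M (server, database, schema and table names are placeholders, not values from this thread):
        // Minimal Power Query (M) sketch of a plain SQL Server table load with no transformations.
        // "MyServer", "MyDatabase", "dbo"/"Orders" are placeholders.
        let
            Source = Sql.Database("MyServer", "MyDatabase"),
            Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data]
        in
            Orders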

    I ran some other tests and found important information (maybe there is a bug, but read the following).
    The functions DateTime.LocalNow and DateTime.FixedLocalNow work correctly, generating these statements to SQL Server:
        convert(datetime2, '2014-05-03 06:37:52.1135108') as [LocalNow],
        convert(datetime2, '2014-05-03 06:37:52.0525061') as [FixedLocalNow],
    The functions DateTimeZone.FixedLocalNow, DateTimeZone.FixedUtcNow, DateTimeZone.LocalNow, and DateTimeZone.UtcNow stop the scheduled refresh with the error I mentioned in my previous messages, generating these statements to SQL Server:
        '2014-05-03 06:37:52.0525061+02:00' as [TZFixedLocalNow],
        '2014-05-03 04:37:52.0525061+00:00' as [TZFixedUtcNow],
        '2014-05-03 06:37:52.1135108+02:00' as [TZLocalNow],
        '2014-05-03 04:37:52.1135108+00:00' as [TZUtcNow]
    I solved the issue by placing the DateTimeZone calls after a Table.Buffer call, so that query folding does not translate these functions into SQL. However, it still seems like something to fix.
    Marco Russo http://www.sqlbi.com http://www.powerpivotworkshop.com http://sqlblog.com/blogs/marco_russo
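    To make the workaround concrete, here is a minimal M sketch of what is described above; server, database and table names are placeholders, and the only point is that Table.Buffer sits before the DateTimeZone call, so the added column is no longer folded into the generated SQL:
        // Sketch of the workaround: buffer the table first, then add the DateTimeZone-based
        // column, so query folding stops at Table.Buffer and the DateTimeZone value is not
        // pushed into the SQL statement. "MyServer", "MyDatabase", "dbo"/"Orders" are placeholders.
        let
            Source = Sql.Database("MyServer", "MyDatabase"),
            Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
            Buffered = Table.Buffer(Orders),
            WithTimestamp = Table.AddColumn(Buffered, "RefreshedAtUtc", each DateTimeZone.UtcNow(), type datetimezone)
        in
            WithTimestamp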

  • Change of sign when running the data management package

    Hi All,
    I have written script logic for copying my Budget member of the Category dimension to another member, say "XYZ", of the Category dimension, and this is for my Sales member of the Account dimension, which has ACCTYPE = INC.
    When I run the logic through default logic I get the correct value; however, if I run the same logic through a Data Manager package, the sign changes.
    E.g. 100000 is copied as -100000 and -50000 is copied as 50000.
    Please help me out
    Regards
    Navin

    Hi Navin,
    CREDITPOSITIVE= YES | NO
    Default: YES
    If No, all amounts referring to an ACCOUNT type (LEQ, INC) will have their signs reversed.
    Since you are referring to the account member Sales as type INC, the logic considers its sign as reversed.
    Please look at this link which helps you to understand
    Significance of Account Type EXP,LEQ,INC,AST in BPC- Need some valid reason
    Hope this helps.
    Regards,
    Sanjeev

  • Power Query/Data Management Gateway data refresh not working

    Pretty new to o365/Power BI, but here's what I've got going on (hopefully someone can help me out):
    I created a data management gateway and data source. The data source says it's online in the Power BI admin center, and it seems to be working correctly. I can open Excel 2013 on my desktop (logging in as my trial o365 account which has Power BI associated with it) and connect to the data source via Power Query using the oData option. I make sure to import the data into the model, and then open up the PowerPivot window and create a simple pivot table using the data in the model.
    That all works just great. The problem comes when I upload the workbook and try to update it. I've tried a few different ways.
    1. When I try to manually refresh the workbook by opening it in my o365 site and going to data-->refresh I get the following error:
          An error occurred while working on the Data Model in the workbook. Please try again.
          We were unable to refresh one or more data connections in this workbook.
          The following connections failed to refresh:
          Connection: Power Query - dbo_DimProductCategory
          Error: Out of line object 'DataSource', referring to ID(s) 'a75593f3-c34d-4f83-9458-49aa2cece164', has been specified but has not been used.
          The following system error occurred: Class not registered
          The provider 'Microsoft.Mashup.OleDb.1' is not registered.
          Power Query - dbo_DimProductCategory
    2. When I go into Power BI and go to "Schedule Data Refresh" for the workbook, I get the following error:
          Sorry, the data connections in this report aren’t supported for Scheduled Refresh.
          Technical Details
          Correlation ID: B3CE4B10-2137-E593-6FCF-189B73465190
          Date and Time: 03/31/2014 06:20:39 PM (UTC)
    Any help would be greatly appreciated.  If you need additional information, I'd be happy to provide it.

    Hey Guy,
    Thanks for the reply.
    The "data source type" of the data source in Power BI is "SQL Server" (there was only that and Oracle available to select from)
    The "data source type" that Power Query is using in Excel is "From oData Feed"....which is using the Power BI data source....which is using the data management gateway.
    A few follow up questions if you have a second
    1. Do you know when PowerQuery data refresh will be supported? (just a general idea....weeks, months, next year?)
    2. Is there any other way to connect to (and be able to refresh) Power BI data sources referencing "on prem" data via the data management gateway? I tried using the oData feed URL in non-Power Query areas of Excel (the Excel Data tab, PowerPivot directly), but it didn't work. If there's some other way to connect to and refresh on-prem data, I'm all ears :D
    Really appreciate your help, thanks for taking the time.
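    For context, the "From oData Feed" Power Query connection being discussed is generated roughly like this in M; the feed URL below is a placeholder (the real one comes from the Power BI data source page), and dbo_DimProductCategory is the table name from the error above:
        // Rough sketch of the oData feed connection described in this thread.
        // The feed URL is a placeholder, not a real endpoint.
        let
            Source = OData.Feed("https://contoso.example.com/odata/MyDataSource"),
            dbo_DimProductCategory = Source{[Name = "dbo_DimProductCategory", Signature = "table"]}[Data]
        in
            dbo_DimProductCategory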

  • Cannot Delete Articles from Data Manager

    Hi Gurus,
    While deleting records from Data Manager I am getting the error
    "Insufficient disk space available on DBMS" and as a result I cannot delete records.
    What could be the issue?
    Please let me know if I should take the help of our Basis admin team, or is there anything that I can work out in MDM?
    Regards,
    Vikrant M Kelkar..

    Hi everyone,
    Got it. The issue was that the tablespace was full and it didn't allow me to delete any further articles.
    I increased the tablespace and it's all good now.
    Thanks all (whoever reads this thread).
    Regards
    Vikrant M Kelkar

  • Is there a Java utility class to help with data management in a desktop UI?

    Is there a Java utility class to help with data management in a desktop UI?
    I am writing a UI to configure a network device that will be connected to the serial port of the computer while it is being configured. There is no web server or database for my application. The UI has a large number of fields (50+) spread across 16 tabs. I will write the UI in JavaFX. It should run inside the browser when launched and issue commands to the network device through the serial port. The UI has several input fields spread across tabs and one single Submit button. If a field is edited and the Submit button is clicked, the UI issues a command, sends the new datum to the device, and retrieves the current value and any errors; if an input field has bad data, that is indicated, for example by giving the field a red border.
    Is there a standard design pattern or Java utility class to accomplish the frequently encountered, 'generic' parts of this scenario: lazy loading, submitting only the fields that changed, displaying which fields have errors, etc.? (I don't want to reinvent the wheel if it is already there.) Otherwise I can write such a class and share it back here if it is useful.
    Someone recommended JGoodies Bindings for Swing; will that work well, and does it work with JavaFX?

    Many thanks for the reply.
    > In the servlet create an ArrayList and in the for loop put the instances of the csqabean in this ArrayList. Exit the for loop and then add the ArrayList as an attribute to the session.
    I am making use of a Vector and did the same thing as you mentioned. I am using scriptlets...
    > In the JSP retrieve the ArrayList from the session and in a for loop step through the ArrayList, retrieving each CourseSectionQABean and displaying it. You can do this in a scriptlet but should also check out the JSTL tags.
    I was able to remove this problem. Thanks again for the suggestion.
    AS

  • Data Manager package and process chain is not working

    Hi All,
    I executed a Data Manager package which contains a process chain to revaluate one of my Account dimension members, say "Revenue". I am working on BPC NW 7.0.
    steps I followed:
    1. I created a script logic file and created a custom process chain.
        process chain steps:
      a) Start variant
      b) Modify dynamically
      c) Run Logic
      d) Or and Clear BPC tables
    2. This process chain was included in a Data Manager package.
    3. The Data Manager package was modified to include the parameters and the script logic file name.
    4. Executed the data package.
    The issue is: when I execute the Data Manager package I don't get any error, but when I view the status I don't see any package running or completed. If I look at the process chain, it is failing at the first step, Modify Dynamically. No clue why.
    Could you please let me know what the issue could be?
    Cheers,
    SAC

    I encountered this problem. Do these steps:
    I. First, check if your process chain exists in the library.
    II. If yes, follow the steps below:
    1. eData - Organize Package - Modify your package.
    2. Check that you have the correct process chain.
    3. If yes, click View Package on its right side.
    4. Expand the Task folder and take note of the task name (e.g. ZBPC_PROT_RUN_LOGIC).
    5. Click Advanced and compare the task name you noted with the one in the TASK syntax
    (e.g. TASK(ZBPC_PROT_CF_RUN_LOGIC,SUSER,%USER%)).
    6. They should be the same.
    A package that runs without ever showing any status happens when the system cannot find your process chain.
    Hope this helps.
