File Reference Data

Hi!
Is there a way to access a file reference's data, or do I have
to upload the file to the server and then download it in order to get
the file data? (This sounds like overkill.)
Also, is there a way to store the file reference data for
future use? Let's say I wish to keep the file reference data so my
Flash application can access the same file in the future; could I store
this, let's say, in a server database or something?
I hope people from Adobe can answer this :)
Thanks
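
For context, here is a minimal ActionScript 3 sketch of the workflow the question describes, assuming a hypothetical upload URL: FileReference lets the SWF browse for a file and upload it, and exposes metadata such as name and size, but (in the Flash Player versions discussed here) not the file's contents, so any processing of the data has to happen after a round trip to the server.

    import flash.net.FileReference;
    import flash.net.FileFilter;
    import flash.net.URLRequest;
    import flash.events.Event;

    // The user picks a file; the SWF receives a FileReference, not the file's bytes.
    var fileRef:FileReference = new FileReference();
    fileRef.addEventListener(Event.SELECT, onSelect);
    fileRef.browse([new FileFilter("Images", "*.jpg;*.png")]);

    function onSelect(e:Event):void {
        // Metadata is available locally...
        trace(fileRef.name + " (" + fileRef.size + " bytes)");
        // ...but to get at the contents, the file can only be sent to a server.
        // The URL below is a placeholder for this sketch.
        fileRef.upload(new URLRequest("http://example.com/upload.php"));
    }

(Later Flash Player releases added FileReference.load(), which reads the selected file's bytes into the SWF directly, but that option postdates this discussion.)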

Peter,
Thanks for taking the time to discuss this issue. I've
always liked the approach Flash has taken to protecting personal data,
or, to be more specific, to completely separating the Internet from the
local PC environment. And the proof that you guys are on the right
track is that there are no Flash viruses yet.
Still, if it's possible, I would like to make a few
suggestions that shouldn't affect the security model but would
definitely improve the way Flash communicates with the server:
1. Access a file reference locally. I'm not talking about
accessing any file on the local PC, only the one the user manually
selected for, let's say, an upload (basically, add an open command
to FileReference). This would be very useful since it would allow the
SWF to process the file before sending it, which, for example, would
allow changing an image's size before upload, or processing a text
file and uploading only the results of that process. I don't see how
this could be a security risk, since you are already able to do this
today by going through the server, with a huge overhead.
2. Being able to store the file reference. Again, without
accessing the PATH of the file, storing the file reference object
itself would be most useful for lots of applications; you don't need
access to a directory structure, only the ability to access a
file previously selected by the user. Just a while ago I wrote a
Flex application that uploaded a single file to a server every
hour. I was able to store in a shared object all the parameters the
user selected for the upload except for the file reference. This
means that the user has to select the file every time they open the
application (see the sketch below).
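
As an illustration of the limitation in point 2, here is a hedged ActionScript 3 sketch of what such a Flex/Flash app can and cannot persist today: the plain upload parameters fit happily in a local SharedObject, but the FileReference itself cannot be serialized or rebuilt from a stored path, so the file has to be re-selected with browse() on every run. The property names and URL below are illustrative only.

    import flash.net.SharedObject;
    import flash.net.FileReference;

    // The upload settings the user chose can be persisted locally...
    var prefs:SharedObject = SharedObject.getLocal("uploadPrefs");
    prefs.data.serverUrl     = "http://example.com/upload.php"; // placeholder
    prefs.data.intervalHours = 1;
    prefs.data.category      = "hourly-report";
    prefs.flush();

    // ...but there is no supported way to persist the FileReference itself:
    // it cannot be stored in a SharedObject or re-created from a saved path,
    // so the file must be chosen again on every session.
    var fileRef:FileReference = new FileReference();
    fileRef.browse();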

Similar Messages

  • Same data wire (source and sink are listed as "datalog file reference of") will not connect

    I am trying to write a short program that will convert a string spreadsheet file into a datalog file with the numbers turned back into numbers.  I have a case structure for each type of spreadsheet (different data types per column and different numbers of columns).  I use a 'for loop' to process all records (rows).  And I use a 'sequence structure' to individually process each column item into its original format (string or number), then group them together into a bundle and write the results to the datalog file format.
    The first two case structures worked, but the second pair of case structures would not allow me to connect the datalog file reference number from the 'write to datalog file' to the 'close datalog file' through the 'for loop' wall.
    I am using LabVIEW 8.2.
    I am sure it is a stupid issue, but the error message just did not point me towards a solution.
    con-fUsed, DogFace.
    Attachments:
    PROCESS SH TO DLOG.vi ‏84 KB

    The problem is that you are creating a different type of datalog file in cases 2 and 3 than you are in cases 0 and 1.
    The structure of a datalog file is the same as an array of clusters. All of the clusters must be of the same structure. Only the values of the individual cluster elements can vary.
    You are trying to write three different cluster structures to a single datalog file. You just can't do this.
    You'll either need to write to three different files, or come up with a structure that everything will fit into.
    One other quick hint. NEVER use a variable to pass data when you can wire the terminal directly to the function.
    Ed
    Message Edited by Ed Dickens on 03-19-2007 11:39 AM
    Ed Dickens - Certified LabVIEW Architect - DISTek Integration, Inc. - NI Certified Alliance Partner
    Using the Abort button to stop your VI is like using a tree to stop your car. It works, but there may be consequences.

  • Crystal Reports XI String [255] limit with the File System Data driver...

    I was trying to create a Crystal Reports XI report to return security permissions of files and folders.  I have been able to successfully connect and return data using the File System Data driver as the Data Source; however, the string limit on the ACL NT Security field is 255 characters.  The full string of data to be returned can be much longer than the 255-character limit, and I cannot find how to manipulate that parameter.
    I am currently on Crystal XI and Crystal XI R2 and have applied the latest service packs but still see the issue.  My Crystal Reports database DLL for File System data ( crdb_FileSystem.dll ) is at product version 11.5.10.1263.
    Is it possible to change string limits when using the File System Data driver as the Data Source?  If so, how can that be accomplished?  If not, is there another method to retrieve this information with the Windows File System Data being the Data Source?  Meaning, could I reach my end-game objective of reporting on the Windows ACLs with Crystal through another method?

    Hello,
    This is a known issue. In early versions you could not create folder structures longer than 255 characters. With the updates to the various OSes this is now possible, but CR did not allocate the additional space required.
    It's been tracked as an enhancement (ADAPT01174519), but set for a future release.
    There are likely other ways of getting the info and then putting it into an Excel file format and using that as the data source.
    I did a Google search and found this option: http://www.tomshardware.com/forum/16772-45-display-explorer-folders-tree-structure-export-excel
    There are tools out there to do this kind of thing....
    Thank you
    Don
    Note: the msls.exe referenced there appears to be a trojan (http://www.greatis.com/appdata/d/m/msls.exe.htm), so don't install it.
    Edited by: Don Williams on Mar 19, 2010 8:45 AM

  • A File Reference and its evolving life!

    Hi all,
    I've noticed something that came as a little bit of a surprise to me, but I think I have the explanation, at a hand-wavy higher level anyway.  What I have not established is whether this is a 'bug' or a 'feature', and whether there are any ways the following issue can be avoided at the NI function/API layer.
    Consider the file open and file close functions.  You open a file, you use the reference to the file to write/read data, then at some point you close the reference and the close function spits out the file path.  Here are a couple of tidbits you may not be aware of (that are easy to test):
    Q1) After your application opens/creates a file and starts using the file-reference to make file writes, if an external source changes the file-name of that file... guess what will happen on your next write function call?
    A1::  The write successfully updates the newly re-named file with your new data without producing an error or a warning.  (At least this is the case if your program is running on a vxWorks cRIO target and the file-name is changed directly on the cRIO via an FTP browser.)  
    Did this surprise you? It did surprise me!  -My hand-wavy explanation is that the file pointer is perhaps managed/maintained by the OS, so when the OS tells the file system to rename that file, the pointer that LabVIEW holds remains valid and the contents of the memory at the pointer location were updated by the OS.
    Q2) Continuing from the situation setup in Q1, after writing several new chunks of data to a file now currently named something completely different than when the file reference was originally created, you use the close function to close the file-reference.  What do you expect on the file-path output from the close function??  What do you actually get??
    A2::  The close function will 'happily' return the ORIGINAL file-name, not the actual file-name it has been successfully writing to(!).   This has some potentially significant ramifications on how/what you can use that output for.  At this point there is a ton of room for pontifications and more or less 'crazy' schemes for what one could do, but I argue that the bottom line is that your application has at that point completely lost the ability to accurately and securely track your file(s).  Yes, you could list a folder and try and 'figure out' if your file-name was re-named during writing and you can in various ways make more or less good 'guesses' on which file you in reality just had open, but you can never really know for sure.
    So, what do you guys think?? Is the behavior of returning the (incorrect) original file-path when you close the handle a BUG or a FEATURE??  Would it not be possible for LabVIEW to read back the data contained in the (OS?) pointer location and as needed update the file out path data when it closes a reference?  Should we not EXPECT that this would be the behavior?
    Q3)  Again, continuing from the above situation, let's assume we are back at the state in Q1, writing data to a (re)named file.  What happens if the file is deleted by an external process? What happens to the file reference? File function calls using the reference?
    A3::  This one is less surprising.  The file reference remains 'valid' (because it is a valid reference), but depending on the file function you are calling, you will get an error such as error 6 (a binary write reports this) or error 4 (a TDMS write reports this), etc.  So as long as you don't rely on file refnum tests to establish whether you are good to go with a file write or file action, you should be safe to recover in an appropriate way.  Just don't forget to close the file reference; even if the file is 'gone', the reference will still remain in memory until you 'close' it (with an error). (I might be wrong about this last part.)
    I am not sure if the above is possible on e.g. Windows; Windows would probably prevent you from renaming a file that has an open file handle to it, but this is definitely observable on at least vxWorks cRIO targets.  (I don't have PharLap ETS or RT Linux devices so I can't test on those targets; if you want to test, it's pretty straightforward to make a simple test app for it.)
    [begin rant-mode related to why I found this out and why this behavior BITES]
    There are situations where the above could cause some rather annoying issues that, for somewhat contrived reasons related to cRIO file API performance, CPU and memory resource management, are non-trivial to work around.  For example, using the NI "List Folder" function to list folders takes a very hefty chunk of time at 100% CPU that you cannot break up, so polling/listing folders after every file update (or even on a less regular interval) is a big challenge, and if you are really unlucky (or didn't know any better) and gave the list command in a folder with thousands of files (as opposed to fewer than about 100 files), the listing will lock your CPU at 100% for tens of seconds.  Therefore, you might be tempted to maintain your own look-up table of files so that your application can upload/push/transfer and/or delete files as dictated by your application-specific conditions... except that only works until some prankster or well-intentioned person remotes in and starts changing file names, because then your carefully maintained list of file names/paths suddenly falls apart.
    [\end rant]
    QFang
    CLD LabVIEW 7.1 to 2013

    Hey guys, thanks for turning out your comments on this thread!
    -Deny Access: still able to rename (and delete) the file via the FTP browser (didn't test other file avenues).  I think this is for the same reason that NI vxWorks targets (such as the cRIO-9014) do not support the concept of different users with different rights; as such, everyone has access rights to everything at the OS level.  Another issue for me would be that "Deny Access" does not work on TDMS file references, so even if it worked, it would not help me.
    --> I strongly suspect that these things are non-issues, or issues that can be properly managed, on the new NI Linux RT targets, since FTP is disabled by default and they support user accounts and user restrictions on files/folders.  The controller could simply create the files in a tree where 'nobody else' has write access.
    Obviously nobody should mess around with files on a (running) cRIO, but customers don't always do what they are supposed to do.   
    As for the 'resources' or overhead needed to update the file refnum with the new information, this would not need to be done in a polling fashion; simply, when the file-close function is called, it could update its internal record from the pointer data as part of that call, so this should be a low-overhead operation, I would think.  If that is a real concern, a Boolean input defaulting to not updating, or a separate 'advanced close', could be created.
    I've included a zip with the LV2013 project and test VIs (one for TDMS, one for binary) that I've used. Nothing fancy, but in the interest of full disclosure.  The snippet is the 'binary file' test VI, in case you just want a quick peek:
    Steve Bird's finding of (yet) another behavior on PharLap systems is also very interesting, I think!!
    [EDIT]  JUST TO CLARIFY: on vxWorks, the renamed file keeps being successfully written to, unlike the empty file that Steve Bird found on PharLap.
    QFang
    CLD LabVIEW 7.1 to 2013
    Attachments:
    cRIO Tests.7z ‏30 KB

  • How can I create dynamic file references in Power Query?

    Hi all,
    I'm new to using Power Query, and so far I like it. There's one thing I am struggling with, though... Once I have set up my Power Query connections, I don't find an easy way to change the file to which the query is connecting. I'm using it for a monthly recurring
    process, and every month the source data to query would be different: the same format/structure, but just a different dataset.
    Is there a way to make the source setup more dynamic? Can I, for example, enter the name and path of the new source file in a parameters sheet and have the queries update?
    Currently the Advanced editor shows me following file reference:
    let
        Source = Excel.Workbook(File.Contents("Z:\Templates\EMEA\Source Data Tables\EMEA_EW_Source_Data_for_Power_Queries v1.xlsm")),
    Thanks in advance for suggestions

    Yes, this is something that you can do with Power Query. Here's how you can do it:
    Create a table in Excel containing your parameter value. Let's say that it has one column, called ParameterValue, and one row.
    Create a new Power Query query that gets the data from this table. Call the query something like ParameterQuery.
    In your original query you will now be able to reference values from your parameter query by saying something like this:
    Source = Excel.Workbook(File.Contents(ParameterQuery[ParameterValue]{0})),
    HTH,
    Chris
    Check out my MS BI blog I also do
    SSAS, PowerPivot, MDX and DAX consultancy
    and run public SQL Server and BI training courses in the UK

  • Error while executing planning function with reference data

    Hi,
    I have two planning functions: one is used to upload the file (the 'Reference data' checkbox is not set in planning function type RSPLF1), and the other planning function (the 'Reference data' checkbox is selected in the custom planning function type in RSPLF1) executes the logic of creating a new record along with the flat file data.
    The following data is uploaded:
    Company code | Profit_ctr | calmonth | Amount
    1000                 | 50000      | 01.2011  | 150
    Cube data
    Field1    |  Company code | Profit_ctr | calmonth | Amount
             |  1000                 | 50000      | 01.2011  | 150
    Z1         |  1000                 | 50000      | 01.2011  | 150
    Now I want to change the value from 150 to 200, and when I try to execute with the following data, it gives a dump: 'a row with the same key already exists'.
    Company code | Profit_ctr | calmonth | Amount
    1000             | 50000          | 01.2011  | 200
    Ideally, in the second execution it should append a new row with an Amount value of 50 (the delta value) to the cube.
    I debugged the issue and found that I_TH_REF_DATA contains the following data, and C_TH_DATA also contains the same records.
    Field1     Company code | Profit_ctr | calmonth | Amount
    #     1000                 | 50000      | 01.2011  | 150
    Z1     1000                 | 50000      | 01.2011  | -150
    Z1     1000                 | 50000      | 01.2011  | 150
    Because of this, a record with that key already exists in C_TH_DATA, and trying to append a new record with the same combination fails.
    C_TH_DATA should contain only the source data with Amount 200, but I'm not sure why the reference data is showing up in C_TH_DATA.
    Could anyone please guide me on how the reference data is getting populated in C_TH_DATA ?
    Thanks in advance
    Edited by: peppy on Aug 3, 2011 5:00 PM
    Edited by: peppy on Aug 3, 2011 8:37 PM

    Hi Peppy,
    C_TH_DATA is a hashed table!  According to your post you are trying to append to C_TH_DATA, and this results in a dump. Please take a look at the standard planning functions to see how SAP programs them. E.g. in CL_RSPLFC_REPOST, method IF_RSPLFA_SRVTYPE_IMP_EXEC~EXECUTE, you can find the following code:
      CREATE DATA l_r_data_wa LIKE LINE OF c_th_data.
      ASSIGN l_r_data_wa->* TO <s_data_wa>.
      CREATE DATA l_r_new_wa LIKE LINE OF c_th_data.
      ASSIGN l_r_new_wa->* TO <s_new_wa>.
      LOOP AT c_th_data INTO <s_data_wa>.
        <s_new_wa> = <s_data_wa>.
        " here the SAP code changes the values; you can do it your way here
        " and then write the changes back
        MODIFY TABLE c_th_data FROM <s_data_wa>.
      ENDLOOP.
    Another option is to use the READ statement to check if the record is already in the table. If it is not there, you use INSERT; otherwise you use MODIFY. So you get something like this:
      READ TABLE c_th_data FROM <s_data_wa> TRANSPORTING NO FIELDS.
      IF sy-subrc <> 0.
        INSERT <s_data_wa> INTO TABLE c_th_data.
      ELSE.
        MODIFY TABLE c_th_data FROM <s_data_wa>.
      ENDIF.
    Depending on your requirements, you can also use the COLLECT statement.
    If c_th_data shows the reference data as well, you may need to adjust the filter to restrict it to the correct values.
    Hope this helps.
    Best regards
    Matthias Nutt
    SAP Consulting Switzerland

  • Inserting automatic file name, date and time in Illustrator CS3?

    hello all,
    is there an option in Illustrator where you can automatically insert into your file the file name and the date and time when it was created or last modified, etc.?
    I found this option to exist in InDesign, but could not find the same menu in Illustrator. I am using CS3.
    help?
    thanks a bunch.

    Mario,
    would you please be so kind as to include a reference on your site about the script? As far as I can see, the PutDateandTime.js you are providing here is a copy, or a slightly modified copy, of the original script that Wolfgang Reszel has provided on our site for about four years.
    It is okay that the script is available on your site, but please not without a reference. Thanks.

  • Import Comments data and Dimension Members from csv file via Data Manager

    Dear Experts,
    I have two questions regarding the data manager.
    Q1.Is it possible to import "Comments" from the csv file via Data Manager?
    We'd like to import the amount with "Comments".
    My image of csv file is like below;
    ACCOUNT,CATEGORY,TIME,ENTITY,INPUTCURRENCY,AMOUNT,COMMENTS
    1100000,ACTUAL,2010/06,LC,30000,This is comment
    Q2.Is it possible to import the dimension "members" from the csv file via Data Manager?
    We have a user-defined dimension named "Project"
    and would like to import the members, instead of maintaining them in BPC administration manually.
    I found an online help information which says "Import Master Data from a Data File Example",
    but I could not find any relevant sample package for this.
    (I tried to import the members by using the "Import" package, but it failed...)
    reference:http://help.sap.com/saphelp_bpc75/helpdata/en/86/8b1bfc12c94fb0b585cca70d6f1b61/content.htm
    Thanks in advance for your help.
    Fumi

    Hi Fumi,
    In this case, I would suggest you create a customized SSIS package which will fill in the "Comment<APP>" table according to the csv file you have. I do not know of any standard package that allows you to import comments the way you would like...
    Best Regards,
    Patrick

  • Student file & master data change according to authority

    Dear all,
    I have the following requirement about the student file and master data: certain infotypes (IT) should not be displayed when I do not have authority.
    For example, if student A belongs to my department, IT1702 must be displayed, but if student B does not belong to my department, IT1702 must be disabled or must disappear.
    If I can adjust the student file & master data according to authority (whether the student belongs to my department or not), it would be very nice.
    regards,
    jin dal

    Hi,
    The authorizations checks in Campus Management consist of the basic authorization and the structural HR authorization.
    The basic authorization determines whether the user is allowed to execute a certain function, while the structural authorization determines the objects for which the user is allowed to execute this function. In other words, the basic authorization defines what function the user is allowed to use, and the structural authorization defines for which objects the user is allowed to use this function.
    For example, the basic authorization can define that the user is allowed to perform the create module booking activity. With the structural authorization you can restrict this activity only to modules offered by the faculty of Mathematics, for example. (The user can then access these modules whenever required; see also Structural Authorization).
    Basic Authorization
    In release CM 4.64, three authorization objects are used in Campus Management:
    At the first level is the transaction code check. The system performs this check each time the user starts a transaction using the menu or command line. For this check to be successful, the user requires an authorization for the relevant transaction code in the authorization object S_TCODE.
    At the second level, the Campus Management function is divided into two parts. The first part includes activities such as create request, create registration, create re-registration, cancel module booking, and so on. The second part covers master data like student master data and a major part of the academic structure.
    When checking the authorizations for master data, the system uses the HR authorization object PLOG for master data authorization checks. A new authorization object ( P_CM_PROC) has been implemented for activities in release CM 4.64. The system now only checks whether the user is authorized to use the activity. It no longer checks if the user is authorized to read or change the data in this activity. The new authorization concept has the following advantages:
    It simplifies authorization assignment. The system no longer uses the comprehensive data model with its many objects and object interrelationships as the basis for the activity authorization (authorization assignment via authorization object PLOG);
    Changes in the data model have no effects on the authorization checks for activities;
    It is now possible to distinguish between create and change operations, for example in re-registrations;
    You can now distinguish between re-registrations and leaves of absence.
    The table T7PIQPROCESS (Activities) contains all Campus Management activities. The system performs authorization checks for all activities with the exception of the ones listed below.
    Authorization checks are not performed for the following activities:
    AC10 (Send Reminder for Outstanding Payments)
    HSMA (Create Status Indicator Manually)
    PR11 (Create Applicability List Automatically)
    These activities do not contain any activity-related authorization checks.
    In the standard system, the authorization check for activities is independent of the objects for which the activities are performed, and of their attributes. (The structural authorization only restricts the objects which the user can then process irrespective of the activity.). If you require additional checks, you can use the business add-in HRPIQ00AUTHORITY.
    Structural Authorization
    The structural authorization enables you to define the set of objects the user is authorized to process. You determine these objects using evaluation paths. You can define whether the user should only be given a display authorization for these objects or a maintenance authorization as well.
    You cannot combine the structural authorization with the basic authorization. The user is therefore authorized to process the assigned set of objects irrespective of the function (s)he is currently using.
    Further notes
    As functions from other applications areas (Training and Event Management, Notification Processing) and from Student Accounting are integrated in Campus Management, users also need authorizations from these areas.
    Campus Management contains a number of roles which you can combine with the roles of other application areas to create composite roles. You can either assign a composite role or individual roles to users.
    Component | Prefix of the roles provided
    Campus Management | SAP_CM_
    Training and Event Management | SAP_HR_PE
    Notification processing | SAP_CA_NO_NOTIF
    Student Accounting | SAP_FI_CA_
    You create the business partner authorizations in separate IMG activities which you can find in Customizing for Campus Management in Campus Management Master Data -> Students -> Students as Business Partners -> Basic Business Partner Settings -> SAP Business Partner -> Business Partner -> Basic Settings -> Authorization Management.
    In the SAP Reference IMG under Basis Components -> System Administration -> Users and Authorizations, you can find more IMG activities in which you can make general settings for authorizations.

  • Invalid tdms file reference

    I have a program that logs data to a TDMS file.
    On my PC it works fine. When I build an exe and install it on another PC, I get the error:
    "invalid tdms file reference"
    Is this a problem with my VI or with the PC?
    Attachments:
    Stikstofdroger.vi ‏50 KB

    I'm surprised that this works on your development machine.
    You should think about attending a LabVIEW Style Guidelines and a LabVIEW Performance class.
    But to fix your issues regarding the TDMS file path you have to change a few things:
    1. There is only a valid path when "Datalog OFF/ON" is set to true before you start the VI. You should change this by creating a new event case for "Datalog OFF/ON" so that the user is able to start datalogging when the VI is already running.
    2. When "Show Measurements" is set to False, you write an empty path constant into the shift register. Why? Either wire the path from the left tunnel to the right, or consider whether it is even necessary to wire the path through the case structure at all.
    However, I would still recommend that you completely redesign your VI!
    Christian

  • The file reference 0x3000000003d66

    Hi,
    My hard drive recently performed a chkdsk at boot. Since then, I cannot find an Excel file named mdp.xlsx.
    I have the following lines appearing in the bootex.log created after completion of chkdsk:
    The file reference 0x3000000003d66 of index entry mdp.xlsx of index $I30
    with parent 0x39e8 is not the same as 0x1000000003d66.
    Deleting index entry mdp.xlsx in index $I30 of file 14824.
    The file reference 0x3000000003d66 of index entry MDP~1.XLS of index $I30
    with parent 0x39e8 is not the same as 0x1000000003d66.
    Deleting index entry MDP~1.XLS in index $I30 of file 14824.
    I was wondering whether this information could help me. Thanks for letting me know the best path to follow to recover this file.

    Excel Recovery Toolbox is an easy application for restoring corrupted XLS files on any PC-compatible hardware. This program can fix data corruption in damaged tables and
    offers the following to its users:
    The small download size of the Microsoft Excel recovery tool makes it easier to start Excel Recovery Toolbox on offline workstations;
    Recovers table fonts, number formats, worksheets, columns, column widths and row heights;
    Repairs workbook cell data and all types of formulas, including functions and internal, external and name references;
    Fixes the color of cells and cell borders, line styles and alignment.
    For more information:http://www.excelrecoverytoolbox.com/
    Follow these steps:
    Click Start, click Control Panel, click Programs, and then click Uninstall a program under Programs and Features.
    Select Microsoft Office.
    Click Change, and then wait while the change and repair is carried out.
    Exit after the process is completed.
    Double-click the Excel file that you want to open or open the file through an HTML link as you could by using previous versions of the program.
    That’s Microsoft's advice.
    If this method didn't help, you can try to find the answer here:
    http://social.technet.microsoft.com/Forums/en-US/e47536ee-4458-44e3-a9e0-df48fc54d32d/excel-repair-backup-copy?forum=excel

  • Funny stuff with excel file references

    I used 'create a subVI' on a portion of a diagram that is supposed to save some data
    using the Excel toolkit. The subVI had a file reference input and output.
    But now when I run it the reference seems to get corrupted (or at least not
    work right). I tried deleting the input and output references and got new
    copies from one of the supplied Excel toolkit VIs, but it still won't work
    right. I can get it to work by not using the output reference and just
    wiring the following VIs to the input reference. But I wonder how it is
    getting corrupted. Even if the reference is not used at all in the subVI
    but is just wired from input to output, it still messes it up. Anyone know
    why this is happening?

    Hi Adam,
    You seem to find a lot of interesting issues.
    If you have spare time, maybe you should sign up for the BETA test program.
    Please post an example, I'd love to see this one.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Import of Solution Manager reference data to ARIS

    Hi all,
    I would like to import SAP Solution Manager data to ARIS.
    Is it possible to do that by simply importing a SolMan file including reference data into ARIS, or do we need to create an interface between SolMan and ARIS?
    We would like to do this in the simplest way possible, since it is for demo purposes only.
    So if you have any idea please let me know.
    Thanks,
    //Anders

    Anders,
    I have done similar demos in the past. Based on my experience, here is a quick step-by-step procedure for doing this.
    (i) Create a project in SAP Solution Manager and define the system landscape by choosing the
    SAP components based on the selected scenarios.
    (ii) Analyze and Select the content from the Solution Manager BPR based on the
    finalized Scenarios/Processes.
    (iii) Establish a Project Database on the ARIS Server and merge the As-Is Scenario/Process
    from the existing ARIS DB into this.
    (iv) Synchronize Solution Manager Project with ARIS and establish the link between
    ARIS Project DB and SolMan.
    (v) Perform analysis of the As-Is Process in conjunction with selected scenarios from Solution
    Manager reference content and define the structure and scope.
    (vi) Align the As-Is Process with the Reference Framework from Solution Manager and
    add/delete steps as required.
    (vii) The Business process architecture is adapted to the SAP Reference Content aligned to the
    structural layers of Solution Manager (Scenario, Process and Process Step).
    (viii) Synchronize the Business flows created in ARIS  with Solution Manager and update
    structure on both ends.
    Hope this helps you. Let me know if you need anyother information.
    Rgds
    Manish

  • Reference Data Imports from ECC

    All of a sudden our reference data imports from ECC to ESourcing are failing with this error.
                "is either unknown data type or it is missing the required DataType(..) declaration"
    They used to work.  Can someone point me towards the place to check for the issue?  There is a DataType Declaration at the top of the data file (see example) and the data file format has not been changed.
                    #DataType(masterdata.Currency)
                    "DISPLAY_NAME","DOCUMENT_DESCRIPTION","IN_EURO","DISPLAY_PRECISION","STORAGE_PRECISION"
                    "ADP","Andoran peseta" "","0","0"
    Thanks,
    Keith
    Edited by: Keith Wendel on Nov 11, 2009 11:50 AM
    NOTE - I changed brackets to parentheses in the two indented lines above to stop them from appearing as HTML links.

    It turned out that there were unprintable characters at the front of the files, caused by the encoding used in the source ECC system's ABAP program.

  • Get associated avi file from avi file reference

    For normal bytestream files, there is a "Refnum to Path" function (under File I/O -> Advanced File Functions) that will give the associated file path for the input file reference.  This does not seem to work when an AVI file reference (opened using IMAQ's AVI functions) is wired ("Refnum to Path" returns <Not a Path>).  Is there a way to extract the AVI file path from an AVI file reference?

    Thanks for your help.  The design of my program involves streaming video to disk (along with multiple other parallel functions, including DAQ interfacing and collection, playing back AVIs in Media Player, etc.).  Because of this, I have multiple "engine" loops running in parallel with the user interface loop.  In the UI loop, the target of the streaming video can be changed or even turned off, so I have an LV2 functional global holding the AVI file reference, which the video streaming engine checks for validity before proceeding through a loop iteration.  However, in the engine, certain decisions about extra data that needs to be saved with the streaming video are made by checking the directory of the file it is instructed to write to.  Therefore, a way to extract that file name from the reference in the LV2 global would be useful.  The workaround I use now is to store the file path as well as the AVI file reference in the LV2 global and pass them around together.  This isn't that bad, but in the interest of keeping block diagrams uncluttered with data I shouldn't need, I posted the question.
