Flat File Profiles vs. Analysis Authorizations: limited flexibility

Hi,
How can I combine values of different characteristics - that belong together from an authorization point of view - into one profile?
The subject of this post may not be very clear, but let me explain what I need ...
Suppose I have a user that may only display data in MultiProvider 'MP1', for country 'BE' and company code '1000'.
That same user may also plan for country 'BE', but only for company code '2000' in MultiProvider 'MP2'.
You could maintain this manually by having 2 analysis authorizations:
analysis 1:
0TCAACTVT EQ '03'
0TCAIPROV EQ 'MP1'
0COUNTRY EQ 'BE'
0COMP_CODE EQ '1000'
analysis 2:
0TCAACTVT EQ '02', '03'
0TCAIPROV EQ 'MP2'
0COUNTRY EQ 'BE'
0COMP_CODE EQ '2000'
This can be achieved manually in transaction RSECADMIN by maintaining the values and the user assignments by hand.
But we have about 100 users, each authorized for different values (some display-only, some changeable). It goes without saying that maintaining this manually is not feasible. We would therefore like to continue the old way: loading the authorizations into DSOs and generating the profiles afterwards.
How can I structure the flat files so that the system validates the right values together? In other words, the system should allow the user to change/write data for country 'BE' and company code '2000', while the user can only display data for 'BE' in other company codes (let's assume the MultiProvider does not matter in this case).
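To make the grouping concrete, here is a rough sketch (Python, purely illustrative; the column layout and authorization names are invented, not our actual DSO structure) of how we picture rows in the file being combined:

    # Rows sharing the same (user, authorization name) must be combined into
    # ONE analysis authorization; a different combination gets its own.
    from collections import defaultdict

    rows = [
        # (user, authorization, characteristic, value)
        ("BART", "AUTH1", "0TCAACTVT", "03"),
        ("BART", "AUTH1", "0TCAIPROV", "MP1"),
        ("BART", "AUTH1", "0COUNTRY", "BE"),
        ("BART", "AUTH1", "0COMP_CODE", "1000"),
        ("BART", "AUTH2", "0TCAACTVT", "02"),
        ("BART", "AUTH2", "0TCAACTVT", "03"),
        ("BART", "AUTH2", "0TCAIPROV", "MP2"),
        ("BART", "AUTH2", "0COUNTRY", "BE"),
        ("BART", "AUTH2", "0COMP_CODE", "2000"),
    ]

    grouped = defaultdict(lambda: defaultdict(list))
    for user, auth, characteristic, value in rows:
        grouped[(user, auth)][characteristic].append(value)

    for (user, auth), characteristics in grouped.items():
        print(user, auth, dict(characteristics))
    # BART AUTH1 {'0TCAACTVT': ['03'], '0TCAIPROV': ['MP1'], ...}
    # BART AUTH2 {'0TCAACTVT': ['02', '03'], '0TCAIPROV': ['MP2'], ...}

The key point is the extra grouping column: values that belong together share the same authorization name, so the generation step can keep combination 1 and combination 2 apart.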
I hope someone can help here; it's been a brainteaser for a while now.
Please note that it must be possible to achieve this via flat file uploads!
Thanks in advance!
Kind regards,
Bart

Hi Andreas,
Thanks for this information.
The generation of authorizations itself is not the issue.
We want to combine values of different InfoObjects in the same 'generated profile'.
Combinations of other values for the same selected InfoObjects should be generated into another 'generated profile', as described in the initial post.
Any idea how this can be managed?
Thanks in advance!
Kind regards,
Bart

Similar Messages

  • How to handle reference Flat File for Dimensions in Tabular model

    I have 2 flat files, which are needed to create one Country dimension.
    In an SSAS OLAP cube, I would import these 2 files into 2 staging tables and then ETL them into one DimCountry table in the DW.
    How should this be done with a Tabular model?
    Should both files be imported into the Tabular model, and then new columns and DAX added to create the needed relationships?
    Kenny_I

    Hi Kenny_I,
    According to your description, you want to know how to import flat files into SQL Server Analysis Services tabular model, right?
    In your description, you said that "you can import these 2 files to 2 staging table, then ETL to one DimCountry table on DW in OLAP cube." Based on my research, flat files are not supported as a data source for an SSAS tabular model, so you can import these 2 files into 2 staging tables and then import those tables into the SSAS tabular model.
    Reference
    http://msdn.microsoft.com/en-in/library/gg492165.aspx
    http://msdn.microsoft.com/en-us/library/hh230968.aspx
    If I have anything misunderstood, please point it out.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Flat File automation process - limitations

    Hello Everyone,
    I would really appreciate any insights or process-improvement suggestions.
    Background:
    Currently we have around 12 territories providing a new flat file with new data on a daily basis, depending on business activity. This also means that on a day with no activity, no flat file is provided to BI for the loading process.
    The flat files provided need to be loaded into the BI system (PSA - DSO - InfoCube).
    The flat file loading process has been automated for the daily file by implementing the logical file name for each territory.
    1. A process variant in the process chain checks whether the flat file is available on the app server (custom ABAP program).
    2. 12 InfoPackages have been created to pick the data from the flat file on the app server and load the data over into the PSA.
    3. All the InfoPackages merge into an "AND" event in the process chain before the DTP load into the DSO kicks off.
    4. DSO Activation
    5. Recon between the flat file and the DSO to ensure all the data from the flat file has been loaded into the DSO.
    6. DTP loads into the InfoCube.
    7. Recon between the InfoCube and the DSO itself.
    8. Moving the flat file from one folder into another.
    All the above processes are automatically performed without any issues if the flat file is available on the server.
    Problem / Issue:
    The major limitation of the above design is that the flat file must be available on the app server for the whole data flow in the process chain to continue without breakpoints.
    Current workaround / process improvement in place:
    Based on the above limitation and upon further research, I applied the OSS Note that gives the option of maintaining multiple DTPs for the same data target with different filter values.
    However, even with an individual data stream for each territory, each with its own DTP, the issue remains: the process variant (the ABAP program checking whether the file exists) will fail, or, if the ABAP check is removed, the InfoPackage load will fail.
    Because of this, the support team is alerted about the process chain failure.
    Question / Suggestions required:
    The main question (any suggestions welcome): is there an approach where the flat file check program does not have to raise a hard failure in the process chain, so that the rest of the chain can continue with the loading process? (For the rest of the chain to continue, the only linkage options we have are Error, Success, and Always.)
    I have also looked into the Decision process variant available in process chains, but based on the options it offers I cannot use it for the loading process.
    Error can be produced by generating an error message in the ABAP program, but that causes an alert to be sent even if the rest of the process chain finishes. Success would require the flat file to be available. Always cannot be used in this case, as it would cause a failure at the InfoPackage level.
    If a hard error at the InfoPackage load can be avoided, the process chain does not have to remain in a failed state, which in turn will not trigger any alert to the support team.
    Please do let me know if you need more details about the above process improvement question.
    Thanks
    Dharma.

    The main issue, as you mentioned, is that the file has to be available for sure.
    We had a similar issue: a very critical data load that had to happen every day, and failure of the file to load would mean that the day's reports would be delayed.
    We were running on UNIX, and we implemented a simple UNIX loop that does not complete until the file is available on the app server, something like:
    while [ ! -f /path/to/file ]; do
        sleep 15
    done
    You only come out of the while loop once it completes, which means the file has become available. You can write a similar ABAP program to check file availability if required and put it into your chain.
    We also had a failover process: if the file did not arrive after a certain number of tries, we created a zero-byte file with the same name, so the PSA would load zero records and the data load would continue.
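    For what it's worth, here is a rough sketch of that wait-and-fallback logic in Python (purely illustrative; the path, retry count, and delay are invented placeholders):

        # Sketch of the polling/failover idea described above (not production code).
        import os
        import time

        path = "/appserver/in/territory01.csv"  # hypothetical file location
        MAX_TRIES = 40                          # e.g. 40 x 15 s = 10 minutes
        DELAY_SECONDS = 15

        tries = 0
        while not os.path.exists(path) and tries < MAX_TRIES:
            time.sleep(DELAY_SECONDS)
            tries += 1

        if not os.path.exists(path):
            # Failover: create a zero-byte file so the PSA loads 0 records
            # and the process chain continues without a hard failure.
            open(path, "w").close()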

  • In LSMW flat file date format to be converted to user profile date setting.

    hi all,
      I have a flat file in which the date is in MM/DD/YYYY format. I converted the data using LSMW, but this conversion is only valid if the user has set the date format to MM/DD/YYYY in the user profile settings (Own Data). If the user has a different setting, it will give an error. So how do I convert the date and do the mapping accordingly? Please help.

    Sunil,
    Use the function module below to format a date according to the current user's date-format setting:
    DATA: idate TYPE sy-datum,
          tdat8 TYPE string.
    CALL FUNCTION 'DATUMSAUFBEREITUNG'
      EXPORTING
        idate = idate
      IMPORTING
        tdat8 = tdat8.
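    If it helps to see the logic outside ABAP, here is a tiny Python sketch of the same conversion (purely illustrative; the format codes below are invented stand-ins for the user-profile date settings, so check the real codes in your system):

        # Convert a fixed MM/DD/YYYY flat-file date into the user's display format.
        from datetime import datetime

        DATE_FORMATS = {
            "1": "%d.%m.%Y",  # DD.MM.YYYY
            "2": "%m/%d/%Y",  # MM/DD/YYYY
            "3": "%m-%d-%Y",  # MM-DD-YYYY
        }

        def convert(flat_file_date: str, user_format_code: str) -> str:
            parsed = datetime.strptime(flat_file_date, "%m/%d/%Y")
            return parsed.strftime(DATE_FORMATS[user_format_code])

        print(convert("12/31/2008", "1"))  # -> 31.12.2008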
    Amit.

  • Interface output file: tab delimited vs flat file with fixed length

    hey guys,
    any idea on the difference between the two file types: flat file with fixed length vs tab-delimited file?
    thanks

    Tab delimited:
    Two fields are separated by a TAB, e.g. SANJAY<TAB>SINGH, where the first field is the first name and the second is the surname. The Nth field comes after N-1 tabs.
    Fixed length:
    Every field has a fixed starting position and length, e.g. SANJAY    SINGH, where the first field starts at position 1 with length 10, and the second field starts at position 11 with length 10.
    In short: with fixed length, the length of each field is fixed, while in a tab-delimited file the length of a field is not fixed, but we know it ends when the separator (TAB) is encountered.
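    A minimal Python sketch contrasting how a program reads the two layouts (the field names and widths are just examples):

        # Tab-delimited: fields are wherever the TAB characters put them.
        line_tab = "SANJAY\tSINGH"
        first_name, surname = line_tab.split("\t")

        # Fixed length: fields live at fixed positions (here 1-10 and 11-20).
        line_fixed = "SANJAY    SINGH     "
        first_name = line_fixed[0:10].rstrip()
        surname = line_fixed[10:20].rstrip()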

  • How do I profile a Flat File?

    The profiling feature in OWB does not allow you to pick flat files. Can a flat file be profiled? If not, is the only alternative to load the flat file into the database?
    Thanks.

    You have to load your flat file into an external table first...

  • Reading huge flat file in OSB 11gR1

    Hi,
    I want to read a flat file in OSB. The flat file may be fairly large, up to 1 MB.
    As far as I know, OSB provides the following approaches to read a flat file:
    1. JCA (creating a file adapter in JDeveloper and importing the artifacts into OSB)
    2. MFL transformation
    3. Java callout
    Please let me know which is the best way to read the flat file. Also, is there any other way to do the same?
    Thanks in advance.
    Regards,
    Seemant

    Which option is the best one to convert a flat file to XML: the File Adapter or MFL? Well, it's a topic of debate, and it usually comes down to your choice. Manoj has explained clearly above why one may prefer the File Adapter over MFL. It also depends on your familiarity with the product: if you are an Oracle developer dealing mostly with BPEL/Mediator, you will prefer the File Adapter in this situation, even with OSB; but if you are an OSB developer (since the days it was known as ALSB), you will prefer MFL over the adapter.
    It's just a matter of choice and comfort. Remember that in different cases the two solutions may perform differently, so test them from a performance perspective and then choose.
    "Such flexibility of optional tags can only be handled by MFL."
    I don't think so. The File Adapter should also be able to handle this use case. Have you checked this?
    http://download.oracle.com/docs/cd/E17904_01/integration.1111/e10231/nfb.htm#CHDDHEAI
    "Also, up to what size of input file is supported by MFL?"
    That question can't be answered in general; it depends entirely on your system setup. I personally don't recommend translating huge files in OSB, because it hurts OSB's performance.
    "But I think such a feature will not be supported when the file adapter artifacts are imported and used in OSB 11.1.1.3/11.1.1.4."
    Correct. From the OSB Dev guide:
    25.2.1.1 Oracle JCA Adapter Limitations
    Following are limitations when using some JCA adapters with Oracle Service Bus:
    •Oracle JCA Adapter for AQ – Streamed payload is not supported with Oracle Service Bus.
    •Oracle JCA Adapter for FTP and Files – Attachments (large payload support), pre- and post-processing of files, using a re-entrant valve for processing ZIP files, content streaming, and file chunked read are not supported with Oracle Service Bus.
    http://download.oracle.com/docs/cd/E17904_01/doc.1111/e15866/jca.htm#BABBICIA
    Regards,
    Anuj

  • Error while loading flat file into DSO

    Hi
    I am loading data from a flat file into a DSO. The fields in my flat file are Trans_Dt (CHAR), Particulars (CHAR), Card_Name (CHAR), Exps_Type (CHAR), Debit_Amount, Credit_Amount, ***._Amt, Open_Bal_Check_Acnt (CURR), and 0Currency (CHAR).
    In the Proposal tab, apart from the fields mentioned above, 3 additional fields (field 10, field 11 and field 12) appeared. How did these 3 additional fields appear when I don't have any additional fields in my flat file? I have deleted these extra 3 fields, though.
    When I activate the DataSource, it activates, but I get the message 'Data structures were changed. Start transactions beforehand'. What does this message mean?
    When I hit the 'Read preview data' button, it doesn't show me any data and gives me the error: Missing reference to currency field / unit field for the fields Debit_Amount, Credit_Amount, ***._Amt, Open_Bal_Check_Acnt.
    How do I create a reference field for the fields mentioned above?
    Earlier I didn't have the 0Currency field in the flat file, but in my DSO, while creating the key figures, the 0Currency field was created by default, which is quite obvious. Then, while activating the transformations, I got the message 'No source field for the field 0Currency', so I had to add a new 0Currency column to my flat file and fill it with USD in all rows.
    Please help me load this flat file into the DSO.
    Thank you.
    TR.

    Hi guys,
    Thanks a lot for your answers. With your help I could see the data in 'Read preview data' and schedule the load. I used all the InfoObjects in the InfoObjects column of the DataSource to load the flat file.
    The data arrived in the PSA successfully, without any issues, but when I executed the DTP it failed with errors.
    Earlier there was no mapping from the Currency field in the source to all the key figure fields in the target transformation; Currency was mapped only to 0CURRENCY, but the transformation still activated. As per your advice I mapped the Currency field to the remaining key figure fields, but now I get the error
    'Source parameter CURRENCY is not being used'
    Why is that?
    list of Errors after executing the DTP:
    1. 'Record filtered because records with the same key contain errors'
    Message:
    Diagnosis: The data record was filtered out because data records with the same key have already been filtered out in the current step for other reasons, and the current update is non-commutative (for example, MOVE). This means that data records cannot be exchanged on the basis of the semantic key.
    System Response: The data record is filtered out; further processing is performed in accordance with the settings chosen for error handling.
    Procedure: Check these data records and data records with the same key for errors and then update them.
    Procedure for System administration
    Can you please explain this error and how I should fix it?
    2. Exception input_not_numeric; see long text - ID RSTRAN
    Diagnosis: An exception input_not_numeric was raised while executing function module RST_TOBJ_TO_DERIVED_TOBJ.
    System Response
    Processing the corresponding record has been terminated.
    Procedure
    To analyse the cause, set a break point in the program of the transformation at the call point of function module RST_TOBJ_TO_DERIVED_TOBJ. Simulate the data transfer process to investigate the cause.
    Procedure for System Administration
    What does this error mean? How do I set a breakpoint in the program to fix this error in order to load the data?
    And what does 'Procedure for System Administration' mean?
    Please advise.
    Thank you.
    TR.

  • Saving query results to a flat file

    Hello Experts!
    We have a very specific issue on our current project, and I would like to know if any of you have done something similar. We are taking query results from BW (after complex calculations, some based on SY-DATUM) and saving them to flat files, which are transferred to a SQL database structure on the Enterprise Portal. From there, the portal team renders the information into more "static" dashboards that include mouse-over features and even limited drilldown (i.e. no matter where a user clicks, the report always drills down on profit center).
    There are many reasons why the model is set up like this (mostly the training level of executive end users), and even though it doesn't mesh with the idea that BW could do this work on its own, we have to work with what we have.
    We have come up with 3 possible ways to save this data to flat files and hopefully someone can tell us which might be the most effective.
    1.) Information Broadcasting. If we broadcast XML files to the portal, will the portal team be able to read that data into a SQL database? Is there another way to use broadcasting to create and send a flat file to a specific location?
    2.) RSCRM_BAPI. This transaction does not seem to support texts.
    3.) Open Hub. To get Open Hub to work, we would first have to save all of our query results to direct-update DataStore objects using the APD (calculations based on the current date, for example, would otherwise require daily full loads to the underlying data providers).
    What is the best way to accomplish this? Is information broadcasting capable of doing this?
    Thanks!

    Hi Adam,
    Do you have to use flat files to load the information into the SQL database? (If so, maybe someone else has a suggestion on the best way.)
    I try to stay away from flat file uploads, as they involve a lot of manual work. May I suggest setting up a connection to your SQL database in the DBCON table and then writing a small ABAP program to push the data into the SQL database (which you can automate):
    1) Use the APD to push the data into a table within BW.
    2) Go to transaction SM30 > table DBCON and set up a new entry specifying the connection parameters of your SQL database.
    3) In SE38, write an ABAP program along the following lines (assuming the connection you set up in DBCON is named conn1):
    DATA: con_name LIKE dbcon-con_name.
    con_name = 'conn1'.
    EXEC SQL.
      SET CONNECTION :con_name
    ENDEXEC.
    * Select the data that the APD saved into the BW table into an internal table,
    * then loop over the internal table and insert each record into the SQL table.
    * If the internal table is named itab, address its columns as :itab-column1.
    EXEC SQL.
      INSERT INTO <SQL TABLE> (column names separated by commas)
      VALUES (:itab-column1, :itab-column2, ...)
    ENDEXEC.
    If you decide to take this approach, you can find more information on DBCON and the process on SDN. Good luck!

  • Extract PSA Data to a Flat File

    Hello,
    I would like to download PSA data into a flat file so I can load it into SQL Server and run SQL statements on it for analysis purposes. I have tried creating a PSA export DataSource; however, I cannot find a way to cleanly get the data into a flat structure for analysis and/or download.
    Can anyone suggest options for doing this?
    Thanks in advance for your help. 
    Sincerely,
    Sonya

    Hi Sonya,
    In the PSA screen, try pressing Shift+F8. If this does not bring up the file types, you can try the following: Settings > Change Display Variants > View tab > Microsoft Excel > click SAP_OM.xls and then the green check mark. The data will be displayed in Excel format, which you can save to your PC.
    Hope this helps...

  • HELP!! - Need PL/SQL to write to a flat file!!

    I'm trying to query information about a customer's sales rep and append the results to a flat file. I'm a beginner, and the following pseudocode is the best I have so far. Any advice would be much appreciated.
    Thanks in advance!!
    Paul
    CREATE OR REPLACE PROCEDURE paul IS
      file_handle utl_file.file_type;
      mgrname  ra_salesreps_all.name%TYPE;
      mgrphone ra_salesreps_all.attribute7%TYPE;
      mgrext   ra_salesreps_all.attribute8%TYPE;
    BEGIN
      -- fopen returns the handle; the directory must be accessible to the DB server
      file_handle := utl_file.fopen('C:\WINNT\Profiles\pking\Desktop', 'outputfile.txt', 'w');
      SELECT name, attribute7, attribute8
        INTO mgrname, mgrphone, mgrext
        FROM ra_salesreps_all;
      -- WHERE X-X-X-X-X
      utl_file.putf(file_handle, '%s %s %s\n', mgrname, mgrphone, mgrext);
      utl_file.fclose(file_handle);
    EXCEPTION
      WHEN no_data_found THEN
        NULL;
    END paul;

    Below is a simple one....
    CREATE OR REPLACE PROCEDURE write2file (
      id_h   IN INTEGER,
      matter IN VARCHAR2 DEFAULT NULL
    )
    IS
      v_FileHandle utl_file.file_type;
      root_dir     VARCHAR2(200);
      file_h       VARCHAR2(100);
    BEGIN
      file_h := 'msg_' || id_h || '.txt';  -- you can build a dynamic file name
      root_dir := 'unix_or_nt/home/file_dir';
      v_FileHandle := utl_file.fopen(root_dir, file_h, 'w');
      IF matter IS NOT NULL THEN
        utl_file.put_line(v_FileHandle, 'Additional Information');
        utl_file.put_line(v_FileHandle, '------------------------------------------------------------------');
        utl_file.put(v_FileHandle, matter);
        utl_file.new_line(v_FileHandle, 1);
        utl_file.put_line(v_FileHandle, '------------------------------------------------------------------');
      ELSE
        utl_file.put(v_FileHandle, matter);
      END IF;
      utl_file.fflush(v_FileHandle);
      utl_file.fclose_all();
    EXCEPTION
      WHEN utl_file.invalid_path THEN
        DBMS_OUTPUT.PUT_LINE('Invalid path.');
      WHEN utl_file.invalid_mode THEN
        DBMS_OUTPUT.PUT_LINE('invalid_mode');
      WHEN utl_file.invalid_filehandle THEN
        DBMS_OUTPUT.PUT_LINE('invalid_filehandle');
      WHEN utl_file.invalid_operation THEN
        DBMS_OUTPUT.PUT_LINE('Invalid_operation. The file is not available.');
      WHEN utl_file.read_error THEN
        DBMS_OUTPUT.PUT_LINE('read_error');
      WHEN utl_file.write_error THEN
        DBMS_OUTPUT.PUT_LINE('write_error');
      WHEN utl_file.internal_error THEN
        DBMS_OUTPUT.PUT_LINE('internal_error');
      WHEN OTHERS THEN
        DBMS_OUTPUT.PUT_LINE('A problem was encountered while writing the document.');
    END;
    Calling the procedure:
    execute write2file(100, 'Prints the matter in here.');
    will result in a file named msg_100.txt whose contents will be:
    Additional Information
    Prints the matter in here.
    1) Make sure the directory has write permissions.
    2) The initialization parameter UTL_FILE_DIR = 'unix_or_nt/home/file_dir' must be set on the database server. If it is not, put it in init.ora (ask your DBA) and restart the DB.
    3) Check the syntax of '/' and '\' depending on your OS.

  • Loading of flat file (csv) into PSA – no data loaded

    Hi BW-gurus,
    We have an issue loading a flat file (CSV) into the PSA using an InfoPackage (BI 7.0).
    The InfoPackage has been in use for a while. Previously, consultants with the SAP_ALL profile ran the InfoPackage. Now we want a few super users to run it.
    We have created a role for the super users, including authorization objects:
    Data Warehousing objects: S_RS_ADMWB
    Activity: 03, 16, 23, 63, 66
    Data Warehousing Workbench obj: INFOAREA, INFOOBJECT, INFOPACKAG, MONITOR, SOURCESYS, WORKBENCH
    Data Warehousing Workbench - DataSource (version > BW 3.x): S_RS_DS
    Activity: All
    Datasource: All
    Subobject for New DataSource: All
    Sourcesystem: FILE
    Data Warehousing Workbench - InfoSource (flexible update): S_RS_ISOUR
    Activity: Display, Maintain, Request
    Application Component: All
    InfoSource: All
    InfoSource Subobject: All values
    As mentioned, the InfoPackage in question has been used by consultants with the SAP_ALL profile for some time and has been working just fine. When the super users with the new role execute the InfoPackage, the records are found but not loaded into the PSA. The load seems to be stuck, but no error message occurs. The file we are trying to load contains only 15 records.
    Details monitor:
    Overall status: Missing messages or warnings (yellow)
    Requests (messages): Everything ok (green)
      ->  Data request arranged (green)
      ->  Confirmed with: OK (green)
    Extraction (messages): Errors occurred (yellow)
      ->  Data request received (green)
      -> Data selection scheduled (green)
      -> 15 Records sent (0 Records received) (yellow)
      -> Data selection ended (green)
    Transfer (IDocs and TRFC): Missing messages (yellow)
    Processing (data packet):  Warnings received (yellow)
      -> Data package 1 (? Records): Missing messages (yellow)
         -> Inbound processing (0 records): Missing messages (yellow)
         -> Update PSA (0 Records posted): Missing messages (yellow)
         -> Processing end: Missing messages (yellow)
    Have we forgotten something? Any assistance will be highly appreciated!
    Cheers,
    Anne Therese S. Johannessen

    Hi,
    Try using transaction ST01 to trace the authorizations used during the upload with SAP_ALL,
    and then enhance the super user's profile accordingly.
    Best regards
    Matthias

  • Storing Persistent Data In A Flat File -- Design Ideas?

    I have an application that needs to store a small amount of persistent data. I want to store it in a flat config file, with categories and key-value pairs. The flat file might look something like this:
    John:
    hair=green
    weight=170
    Sally:
    eyes=blue
    weight=110
    and so on. My application will initialize a custom class with the data stored in the file, and then work with that class. When updates are made to the data as the application runs, the file will need to be changed too (so that changes will be reflected even if the program crashes, eg).
    What is the best way to implement this? Does Java have any built-in classes that support something like this? I was thinking about Serializable (which I've never used), but I want the file to be human-readable and editable. What about RandomAccessFile? I'm guessing there is a better way...
    Thanks for any advice,
    John

    I'd use an XML structure; classes for storing and parsing XML are part of the API, and the structure of XML is flexible enough and human-readable.
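    To illustrate the suggestion, here is a short sketch in Python (element and file names are invented; the equivalent in Java would use the XML classes in the standard API):

        # Persist the person/key-value structure from the question as readable XML.
        import xml.etree.ElementTree as ET

        data = {"John": {"hair": "green", "weight": "170"},
                "Sally": {"eyes": "blue", "weight": "110"}}

        root = ET.Element("people")
        for name, attrs in data.items():
            person = ET.SubElement(root, "person", name=name)
            for key, value in attrs.items():
                ET.SubElement(person, "entry", key=key, value=value)

        tree = ET.ElementTree(root)
        ET.indent(tree)  # pretty-print (Python 3.9+) so the file stays hand-editable
        tree.write("config.xml", encoding="utf-8", xml_declaration=True)
        # Re-load at startup with ET.parse("config.xml") and rewrite on each update.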

  • Data Source creation for Master Data from Flat File to BW

    Hi,
    I need to upload master data from a flat file. Can anybody explain, step by step, everything from creating the DataSource up to loading into the master data InfoObject?
    Does anybody have a document on this?
    Regards,
    Chakri.

    Hi,
    This is the procedure:
    1. Create the master data InfoObject, with or without attributes.
    2. Create an InfoSource:
        a) with flexible update, or
        b) with direct update
    3. Create transfer rules, assign the names of the master data and attributes in the "Transfer rules" tab, and transfer them to the communication structure.
    4. Create the flat file with the same structure as the communication structure.
    5. If you chose direct update, create an InfoPackage, assign the name of the flat file, and schedule it.
    6. If you chose flexible update, create an update rule with the name of the InfoSource, then schedule it by creating an InfoPackage.
    Hope this helps. If you still have problems, let me know.
    Follow this link as well:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
    Assign points if helpful.
    Vinod.

  • Converting Idoc flat file representation to XML

    Hi ,
    I went through the guide "How To Convert Between IDoc and XML in XI 3.0". I'm concerned with the second part of the guide, which covers converting from the flat file representation of an IDoc to XML. Can anyone tell me which other design and configuration objects need to be created for this scenario (message types, interfaces, mapping, etc.)?
    Also, at which step of the pipeline does the converted XML arrive?
    The program also expects a filename; what if I want to pass the filename dynamically? Any ideas on this one?
    Hope someone replies this time... :)
    Thanks for your help and for improving my knowledge.
    Thanks
    Advait Gode.

    Hi Advait,
    Let me give you a small overview on how inbound IDOCs work before answering your question-
    The control record is the key to identifying the routing of the IDOC. If you think of IDOCs as normal mail (post), the control record is the envelope: it contains information such as who the sender is, who the receiver should be, and what the envelope contains (no different from receiving mail/letters by post).
    The data records contain the actual data, in our example the actual letter. The status records contain the tracking information.
    Traditionally, SAP's IDOC interface (even before XI came into the picture) has had utility programs to post incoming IDOCs into SAP. One such program is RSEINB00, which basically takes the IDOC file name and the port as input. This program opens the file and posts the contents to the SAP IDOC interface (a set of function modules) via the port. The idea is to read the control record, determine the routing, and post onwards to the application. Note that one piece of information in the control record is the message type/IDOC type, which decides how the data records are parsed.
    Now, in the XI scenario, what happens if we receive the data as a flat file? Normally we use the file adapter and configure in it how to parse the file. But if the incoming file is flat and already in IDOC structure, why configure the file adapter when the parsing capability is already available via RSEINB00 and the standard IDOC interface?
    This is why the guide suggests using RSEINB00. Your concern is how to provide a dynamic filename. My suggestion is to write a wrapper program: an ABAP program in your Integration Engine that determines the file name (based on logic known to you) and then calls RSEINB00 via SUBMIT ... AND RETURN. You would then schedule this ABAP program to run in the background at fixed intervals.
    There are other ways of handling your scenario as well, but given the limited information in your request I will stop here. Post again if you have any more queries.
    KK
