Data Processing in PI 7.1

Hi Experts,
I am working on a Proxy-to-File scenario where the Proxy sends data to XI and a file is created. Data from the sender will be appended to the file till 6:00 PM.
On the sender side, the file has four different record structures:
Header1
Detail1
Detail2
Trailer1 
The header only needs to be written once, at the time of file creation. During the time frame 12:00 AM to 11:50 PM, the detail records need to be appended to the file.
At the end of the day the Trailer1 record needs to be written, and another communication channel will pick up the file and send it to the vendor.
How can I achieve this (one header at the beginning of the day, and a footer at the end of the day)?
Do I need to use a BPM to do this? Is there any other way to achieve this without BPM?
Thanks   
PR

How can I achieve this (one header at the beginning of the day, and a footer at the end of the day)?
For Header:
Since it will be created and sent at the beginning of message processing, you can send it the usual way: in the mapping, just create the target structure resembling the header.
For Footer:
Since it is required to be sent only at the end of the day, create it along with the Header. In Receiver Determination, have two receivers (one for the Header, one for the Footer; the receiver/location for the Footer will be a dummy, not the vendor's).
Now this will create the Footer file at some location X. Next, create another sender channel which will pick up this Footer file at the end of the day (make use of Availability Time Planning) and send it to the vendor at that time.
In the receiver file channel, the construction mode will be Append.
Personally I wouldn't suggest using a BPM. Why? The BPM instance would keep running for the entire day, and that may not be a good approach. That's my take on your problem.
If you find a better solution for this problem, then do not forget to mention it here; it will help the community in the future.
Regards,
Abhishek.
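The header-once / details-append / trailer-at-end pattern described above can be sketched outside PI as a plain append-mode file routine. A minimal Python sketch, assuming hypothetical record layouts and file names (none of these come from the thread):

```python
import os

def append_records(path, details, header="H|DAILY-FILE", trailer=None):
    """Append detail records; write the header only when the file is created."""
    is_new = not os.path.exists(path)            # first message of the day?
    with open(path, "a") as f:                   # construction mode: Append
        if is_new:
            f.write(header + "\n")               # Header1, written once
        for rec in details:
            f.write("D|" + rec + "\n")           # Detail1/Detail2 records
        if trailer is not None:                  # end-of-day run only
            f.write(trailer + "\n")              # Trailer1 closes the file

# Intraday messages just append details; the end-of-day run adds the trailer.
append_records("out.txt", ["rec1", "rec2"])
append_records("out.txt", ["rec3"], trailer="T|3")
```

Once the trailer is written, the file is complete and can be handed over to the channel that sends it to the vendor.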

Similar Messages

  • Problem while processing transaction data from DSO to CUBE

    Hi Guru's,
    we are facing a problem while processing transaction data from DSO to CUBE. Data packets are being processed and updated very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I will suggest you to check a few places where you can see the status
    1) SM37 job log (give the BI request name): it should give you the details about the request. If it's active, make sure the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID, etc. from SM37) and see in SM66 whether the job is running. See if it's accessing/updating some tables or not doing anything at all.
    If it's running and you can see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the Details tab. It may be in the update rules.
    4) ST22: check if any short dump has occurred.
    You can also try SM50 / SM51 to see what is happening at the system level, like reading/inserting tables etc.
    If you feel it's active and running, you can verify by checking whether the number of records in the cube has increased.
    Thanks,
    JituK

  • HKONG: Definition of the processes that handle Material Transaction Interface data

    PURPOSE
    To define the processes that handle Material Transaction Interface data.
    Explanation
    The related information is as follows.
    Two processes are involved in handling Material Transaction Interface data:
    - INCTCM (Process Transactions Interface)
    - INCTCW (Inventory Transactions Worker)
    (1)
    Records are processed into this table by INCTCM - Process Transactions Interface from the interface tables:
         MTL_TRANSACTIONS_INTERFACE     to MTL_MATERIAL_TRANSACTIONS_TEMP
         MTL_TRANSACTION_LOTS_INTERFACE     to MTL_TRANSACTION_LOTS_TEMP (when lots are used)
         MTL_SERIAL_NUMBERS_INTERFACE     to MTL_SERIAL_NUMBERS_TEMP (when serials are used)
    ==> INCTCM validates the data in the interface tables and then moves it to the temp tables.
    (2)
    After the records are processed from the MTL_TRANSACTIONS_INTERFACE into the MTL_MATERIAL_TRANSACTIONS_TEMP
    by the INCTCM - Process Transactions Interface,
    a worker will be launched to process the record from MTL_MATERIAL_TRANSACTIONS_TEMP into MTL_MATERIAL_TRANSACTIONS.
    The worker is called INCTCW - Inventory Transaction Worker.
    The INCTCM - Process Transactions Interface will launch a single INCTCW - Inventory Transaction Worker for all rows
    that meet the criteria in MTL_MATERIAL_TRANSACTIONS_TEMP :
         TRANSACTION_MODE     = 3
         LOCK_FLAG          = N
         PROCESS_FLAG          = Y
    Once the process is complete the records will be moved into the corresponding
    transaction table :
         MTL_MATERIAL_TRANSACTIONS_TEMP     to MTL_MATERIAL_TRANSACTIONS
         MTL_TRANSACTION_LOTS_TEMP     to MTL_TRANSACTION_LOT_NUMBERS
         MTL_SERIAL_NUMBERS_TEMP          to MTL_UNIT_TRANSACTIONS
    ==> INCTCM calls INCTCW, and this process inserts the data from the TEMP tables into the MMT table and the inventory tables.
    The rows in mtl_transactions_interface are processed in 5 phases.
    1. Derives dependent columns, e.g. acct_period_id, primary_quantity, etc.
    2. Detailed validation is performed on the records.
    3. On-hand quantity checks, e.g. for negative quantities.
    4. Reservations are relieved if demand was created in Order Entry.
    5. Rows are moved to mtl_material_transactions_temp, where the
    transaction processor is called to process these rows and update the inventory levels, etc.
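    When transactions appear stuck, the INCTCW selection criteria listed above can be checked directly with SQL. A rough sketch using Python's built-in sqlite3 as a stand-in for the Oracle table (illustrative only; the real MTL_MATERIAL_TRANSACTIONS_TEMP lives in the Oracle Inventory schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE mtl_material_transactions_temp (
    transaction_id INTEGER, transaction_mode INTEGER,
    lock_flag TEXT, process_flag TEXT)""")
con.executemany(
    "INSERT INTO mtl_material_transactions_temp VALUES (?,?,?,?)",
    [(1, 3, 'N', 'Y'),    # eligible: will be picked up by INCTCW
     (2, 3, 'Y', 'Y'),    # locked: another worker owns it
     (3, 2, 'N', 'Y')])   # wrong mode: not ready for the worker

# Rows that meet the worker criteria stated above:
eligible = con.execute("""SELECT transaction_id
    FROM mtl_material_transactions_temp
    WHERE transaction_mode = 3 AND lock_flag = 'N' AND process_flag = 'Y'
""").fetchall()
print(eligible)
```

    The same WHERE clause run against the real temp table shows which rows the next INCTCW launch will pick up.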
    Reference Documents
    -------------------

  • Custom Data Processing Extension and Parameters

    Hello,
    I have successfully implemented a custom data processing extension. Now I'm trying to add parameters, but the documentation is very weak on this topic. Also, all examples of custom data processing extensions I have found so far work without parameter support. Using a search engine I found a custom data processing extension's sequence diagram during report execution. The related article also gives some hints about the IDbCommandAnalysis interface.
    The GetParameters implementation creates a parameter collection based on the current command text and returns this collection. So far no problem.
    My question is how to implement the IDbCommand's CreateParameter method and Parameters property? In CreateParameter I just return a new instance of my IDataParameter implementation, and the Parameters property returns a new parameter collection based on the current command text.
    Is this OK? At least this is the first implementation which seems to work in the data designer. If I share an instance variable between GetParameters and Parameters, I either get all parameters twice or BIDS crashes when pressing the execute button in the dataset designer, depending on my actual implementation.
    What I'm still missing is how to get the parameters added automatically to the dataset's Parameters property page.
    thanx in advance and kind regards, Gerald

    I think that finally (after some days of trying) I could make it work. The dataset and report parameters now reflect the command text's parameters (BTW, to answer my last question: the dataset's properties are updated only after pressing the refresh button). The road to success was not to use the same member variable for the collections returned by IDbCommandAnalysis.GetParameters() and IDbCommand.Parameters, and not to fill the latter with the command text's parameters (see partial source code below).
    public sealed class MyCommand : IDbCommand, IDbCommandAnalysis {
        private string m_commandText;
        private MyParameterCollection m_parameters = new MyParameterCollection();

        public string CommandText {
            get { return this.m_commandText; }
            set { SetCommandText(value); }
        }
        public IDataParameterCollection Parameters {
            get { return m_parameters; }
        }
        public IDataParameter CreateParameter() {
            return new MyParameter();
        }
        // IDbCommandAnalysis: must NOT share m_parameters with Parameters above
        public IDataParameterCollection GetParameters() {
            return GetParameterCollection();
        }
        private void SetCommandText(string commandText) {
            m_commandText = commandText;
        }
        private MyParameterCollection GetParameterCollection() {
            // ... create a fresh parameter collection based on m_commandText
            return new MyParameterCollection();
        }
    }
    Still there are some open questions:
    How can I update the report parameter's properties? Now the parameter's data type is always string, but I'd like to assign the right data type. I'd also like to tag the parameters as hidden and supply a non-queried default value. Is this possible? How?
    Although my implementation of IDbCommandAnalysis.GetParameters() returns a collection containing the parameter names and values, the "prompt query parameters" dialog (after pressing the query designer's execute button in the toolbar) shows a "<blank>" value for the parameters. What could be the problem?
    Thanx in advance and regards,
    Gerald

  • Text Data Processing 4.1 Training in February & April for APJ & EMEA

    We would like to announce upcoming training opportunities in Malaysia and Germany for partners and select customers on Text Data Processing within the upcoming Data Services 4.1. Get early hands-on access to software and participate in the review of the product roadmap.
    Why Partner Test?
    - Receive an introduction to the product and how text analytics capabilities are being delivered through Data Services
    - Only source for partners and select customers to get deep product knowledge directly from development at this early stage
    - Better understand migration scenarios for BusinessObjects Text Analysis XI customers
    Test Scope
    - Entity Extraction from unstructured data in one or more of the 31 supported languages
    - Data Quality transforms for post-processing such as Match and Address Cleanse
    - Sentiment analysis in either English, German, French or Spanish
    Time Frame & Location
    - Kuala Lumpur, Malaysia: February 13-16  >>> FULL <<<
    - Walldorf, Germany: April 16-17 and 18-20  >>> FULL <<<
    Tester Profile
    - Data Services knowledge helpful
    - Previous experience with text analytics or text mining
    - Existing use cases for text analysis and sample data to test
    Tester Demand
    - 2-4 days depending on session
    - No registration fee but attendees must provide their own travel and lodging
    For further Text Data Processing info: http://wiki.sdn.sap.com/wiki/display/BOBJ/TextDataProcessing
    If you are interested in attending or have questions please contact Anthony Waite @ SAP.
    Edited by: Anthony Waite on Feb 7, 2012 4:58 PM

    Did you try this document:
    How to find the Missing Records for the Target Table during Post Load validation at Business Objects Data Services - Bus…

  • New tutorials posted for Text Data Processing on Data Services 4.0

    Check out the 3 new Text Data Processing tutorials available at http://wiki.sdn.sap.com/wiki/display/BOBJ/TextDataProcessing within the Product Tutorials section.
    -- Introduction to Entity Extraction Transform: this 15-minute demo provides an overview of the Text Data Processing Entity Extraction transform for Data Services 4.0. The Entity Extraction transform enables you to process unstructured text, such as web pages, news articles, maintenance logs, text from a spreadsheet, or even from a database column, to extract key pieces of information that you can use in query and reporting.
    -- Using Text Data Processing Blueprints: this 6-minute demo explains how to get up and running with TDP quickly using a series of jobs contained in a blueprint. A blueprint is a sample end-to-end solution available on SAP Community Network.
    -- Creating an Extraction Dictionary: this 13-minute demo explains how to create an Extraction Dictionary, which can be used to customize a TDP Entity Extraction transform to improve results.

    Thanks for sharing your use case.  I have a few questions if you don't mind:
    1) What file formats are typically found within these compressed containers?  For example, html, xml, txt, pdf, MS Office, etc.
    2) Do these compressed containers ever have other embedded compressed containers within them?  For example, a zip file containing other zip files within it.
    3) If the intention is to process document files, do any of the document files have other document formats embedded/nested within them?  For example, a MS Word document with a spreadsheet embedded within it.

  • Custom Data Processing Extension, use in SSRS Report Properties - References

    I've built a Custom Data Processing Extension (CDPE) and registered it
    successfully (i.e. it shows up in the new data source dialog/drop-down and saves just fine, for VS2010-2014). It is intended to be a custom (XML-based) data source. However, by the nature of the beast, I also need a Custom Query
    Designer (CQD) for development testing of the CDPE.
    Here are the errors I get for the CQD:
    Pulling a report up in "Report Preview", which is wired to the CDPE->CQD, I get:
    "An error occurred during local report processing. The definition of the report '/TestDS' is invalid. Error while loading code module: 'Microsoft.ReportingServices.Interfaces, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'. Details:
    Could not load file or assembly 'Microsoft.ReportingServices.Interfaces, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception
    from HRESULT: 0x80131040)"
    My CDPE directly includes Microsoft.ReportingServices.Interfaces.dll AND matches everything as far as version and key. It also includes Microsoft.ReportingServices.QueryDesigners.dll, required for the CQD.
    I've written other WORKING CDPEs, but not one with a CQD (a custom Query Designer replacement in Visual Studio). All the references, from what I can tell, are OK. I think CQDs are broken for XML data sources; the interfaces are not right.
    (will explain further on)
    From the "Data Sources", Dataset Properties, I click on the "Query Designer", I get:
    "An error occurred while loading the query designer 'DATASET' (which is the name of the CDPE). Query Designer: Object reference not set to an instance of an object."
    I "think" XML-type CDPEs are trying to execute a web service call instead of working properly with a
    text-based query for XML. The reason I say this is that I've created both WinForm and WebForm test harnesses, and they both come up with this error: "...Failed to prepare web request for the specified URL. (rsXmlDataProviderError), Invalid
    URI: The URI is empty." (which is nonsense: there is no request, the query is simply text/file-based, and I can read locally ALL of the XML data expected for testing without issue; I'm ONLY making the CDPE XML-based because I have custom
    WCF calls which already work). (If you really want to understand the overall architecture, please see my post: http://social.msdn.microsoft.com/Forums/en-US/d15d9206-95d7-473a-a7f9-a38b4279de8c/ssrs-extension-which-to-use?forum=sqlreportingservices&prof=required )
    Other than "100 mile" overviews from Microsoft, this has got to be some of the worst-documented stuff I've ever seen
    ( http://msdn.microsoft.com/en-us/library/microsoft.reportingservices.interfaces.iquerydesigner.aspx ). Remote debugging doesn't work 95% of the time.
    My environment is VS2013 Ultimate with BI and SQL Server 2012 SP1.
    Thanks Rob
    Rob K

    Update:
    I can now see the Custom Query Designer and get anticipated results (after some fooling around with different combinations).
    Here's how things were broken by the MS SQL Server 2012 product/release team:
    1. they upgraded to .Net v4.x here (to support SharePoint, AX, MS Data Tools, etc.)
    C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.ReportingServices.QueryDesigners.dll
    C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.ReportingServices.Interfaces.dll
    2. they left c:\Program Files\Microsoft SQL Server\MSRS.11.MSSQLSERVER\Reporting Services\ReportServer\bin\Microsoft.ReportingServices.Interfaces.dll at .Net
    v2.x
    3. they don't support Custom Extensions (which use a Query Designer) with anything higher than .Net v3.5
    In my case, I had to segregate:
    a. Report Definition Custom Extension to v4.5
    b. Custom Data Processing Extension to v3.5
    c. Custom Query Designer to v4.x
    d. my WCF/SSO to v4.5.1.
    #2 and #3 above are, in my humble opinion, simply dead wrong as release practice goes (I can see there being early/first-release exceptions, but two years out, with a successor product (2014), this
    should have been rectified more than a year ago.)
    Whoever failed to get this communicated in the 2012 documentation created even more havoc than an average developer can decipher:
    http://msdn.microsoft.com/en-us/library/microsoft.reportingservices.interfaces.iquerydesigner(v=sql.110).aspx
    (I'm still working on how to get the remote debugger working consistently.)
    Rob

  • Calling a Stored Procedure using SSRS Custom Data Processing Extension

    I need an SSRS Custom Data Processing Extension to call a stored procedure for my SSRS report. I referred to many links regarding this, but I cannot find one; instead there are examples of data processing extensions that use XML files and multiple data sources.
    I want the Data Processing Extension to call a stored procedure.
    Please Help. Thanks in advance

    Sorry, why do you need a Data Processing Extension for that? Can't you directly call the procedure
    from an SSRS dataset? What's the RDBMS that holds this procedure?
    Visakh

  • Data processing inside a bpel process

    I am unclear as to how I can execute some data processing while inside a BPEL process. We have a situation where we may get multiple changes to a particular DB record in succession, but we only need to forward the last entry to another DB. I am unsure how I could do this inside a BPEL process. I thought about calling some Java code to do it, but was wondering if there might be other ways to accomplish this.
    Thanks,
    Steve

    Hi Vijay,
    Actually I just want the script for executing a procedure from a package to fetch records from the database in a BPEL process.
    I don't have the technical background in how to create a package and a procedure and then use it in a BPEL process.
    Forward me a link for creating packages and procedures if you know one.
    Regards,
    Chakri

  • Error while executing master data process chain

    hi,
    I'm trying to execute a Master Data Process Chain in BI 7.0, but I'm getting errors at the DTP process of two InfoObjects. The error message is "Request 357 is already being processed" and "Exception CX_RSBK_REQUEST_LOCKED logged."
    Can anyone tell the reason and how to resolve this issue?
    Thanks
    Hima

    Hi Hima
    Check if there are any other loads fetching the same request, or go to SM37, open the job log, and check the status of the job. If it is finished, then check the status of the request too. You can repeat the locked request if the previous request has either completed successfully or failed; wait until the dependent request is finished.
    Just check this and repeat the request; it should be successful now.
    Regards
    Rohit

  • Flat File Active Sync - Notify  admin incase of data processing errors

    Dear Friends,
    We have couple of Requirements to use OOTB flat file active sync adapter
    1. To read data from a flat file and update the records in Sun Identity Manager system
    2. Notify the admin if there are any data processing errors while reading data from the flat file. Data processing errors can occur if there is invalid data; for example, let's say the input flat file has 3 columns defined, but the file contains records which have four values:
    firstname,lastname,email
    testfirst,testlast,[email protected],12345
    Req#1 is working fine. There are no issues with that.
    Req#2: if the file contains invalid data, I noticed that the active sync adapter throws an array-index-out-of-bounds exception, so we need to send an email notification to the admin whenever a data processing error occurs.
    I noticed that whenever a data processing exception occurs, the active sync adapter stops processing records and the active sync input form is not triggered. Unless the active sync form is triggered, it's very difficult to determine whether the data was read successfully or not.
    Please let me know if there are any configurations/customizations to be made on the OOTB flat file active sync adapter to handle data processing errors and send email notifications to administrators.
    Appreciate your help
    Thanks
    Vijay
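    Until the adapter handles this itself, one workaround is to pre-validate the flat file before active sync polls it, so malformed rows are mailed to the admin instead of crashing the poll. A rough Python sketch of the column-count check (the function and expected columns are illustrative, not part of the Sun IdM product):

```python
import csv, io

EXPECTED = ["firstname", "lastname", "email"]

def validate(text):
    """Return (good_rows, errors); rows with a wrong column count are rejected."""
    good, errors = [], []
    reader = csv.reader(io.StringIO(text))
    next(reader)  # skip the header line
    for lineno, row in enumerate(reader, start=2):
        if len(row) != len(EXPECTED):
            errors.append(f"line {lineno}: expected {len(EXPECTED)} values, got {len(row)}")
        else:
            good.append(row)
    return good, errors

# The malformed example from the post: 3 columns declared, 4 values supplied.
data = "firstname,lastname,email\ntestfirst,testlast,jdoe@example.com,12345\n"
good, errors = validate(data)
# 'errors' would then be mailed to the admin (e.g. via smtplib) instead of
# letting the adapter hit an ArrayIndexOutOfBoundsException mid-poll.
print(errors)
```

    Only files that validate cleanly are then dropped into the directory that active sync watches.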

    Hi,
    We have same requirement that
    "Notify admin if there are any data processing errors from a flat file.
    The data processing errors can occur if there is an invalid data or account is locked etc..."
    In short notify admin if any error logged in Active sync Log file while active sync runs.
    Yes, I noticed the same: whenever a data processing exception occurs, the active sync adapter stops processing records and the active sync input form is not triggered. Unless the active sync form is triggered, it's very difficult to go ahead and meet the requirement.
    Please let me know if there are any configurations/customizations to be made on the flat file active sync adapter to send email notifications to administrators.
    Thanks,
    Sudheer

  • Our organization uses an Oracle database hosted on a Unix platform and one of our data processing outputs is a "stuffer" document that has a barcode, and Unix jobs automatically send the document to a printer. Is there a way, or does Adobe have a product

    Our organization uses an Oracle database hosted on a Unix platform and one of our data processing outputs is a “stuffer” document that has a barcode, and Unix jobs automatically send the document to a printer.
    Is there a way, or does Adobe have a product or solution, to create a PDF version of the document including the barcode, before it’s sent to a printer?

    What format is the document that is printed? Or what technology is used to format it for the printer? There isn't a standard way of doing things in Unix.

  • System Master Data Process Chain

    Hello Guys,
    I am working on the Admin Cockpit. So far it's going well. However, I am struggling with scheduling the System Master Data process chain. When I look at its logs, I see that Operation Type (WHM) - Texts becomes red, and the Attribute Change Run (the final step) has not finished yet (almost three hours). Please help.
    Thank you,
    OLGA

    Refer to this; a similar issue is discussed: Alpha Conversion
    Non Alpha Compliant Value... how can this be resolved on BI 7.0 SP 13

  • URGENT : Error: Please create data processing function module

    Hi
    I am getting the following error
    "Please create data processing function module" in table maintenance.
    It is working fine in the dev environment but not in QA and production.
    Please help.
    Regards
    Kalpesh

    Hi Kalpesh,
       I see it as something to do with the naming convention used in the quality and production systems. These must have been set up differently compared to your development system, so check with different naming conventions.
    Reward points if helpful.
    Cheers
    Shafiq

  • Error during data processing

    Hello,
    I am trying to load Accrual Engine costing results from DSO 0ACE_DS01 to the Leasing Accounting Key Figures DSO 0FIL_DS04, but during data processing I get Error 2 in update:
    Cannot determine fiscal year/period for date 99993112 and variant.
    Errorcode: RS_BCT_CRM_FIN006
    Could somebody help? Thanx in advance

    Hello Prakasubw,
    yes, this could be the error. In the InfoPackage I entered for the field datefrom: 01012005
    and in the field dateto: 31129999,
    because in the InfoPackage the fields show length 8!
    What do you think I should enter in these fields in the InfoPackage:
    datefrom:
    dateto:
    Thanks in advance!
    Maybe I can send you a screenshot with the error and the InfoPackage selection?
    Message was edited by:
            Rusty
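    The failing value is easier to see with an explicit parse: internal dates are 8 characters in YYYYMMDD form, so 99991231 is valid while 99993112 (month 31) is not. A quick Python illustration (the formats are assumptions inferred from the thread, not SAP documentation):

```python
from datetime import datetime

def parse_internal_date(value):
    """Validate an 8-character internal date (YYYYMMDD); raise on impossible values."""
    return datetime.strptime(value, "%Y%m%d").date()

print(parse_internal_date("99991231"))     # valid: 9999-12-31
try:
    parse_internal_date("99993112")        # the value from the error: month 31
except ValueError as e:
    print("invalid:", e)
```

    Entering the selection as 01.01.2005 to 31.12.9999 in the screen format, so that it is stored internally as 20050101 / 99991231, avoids the fiscal-period lookup failure.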

  • Size of data processed by the engine

    The Power Query specification here describes the "Size of data processed by
    the Engine" as:
    "Limited to available virtual memory (for 64-bit version) or about 1GB for 32-bit version, if data cannot be fully streamed, such as when sorting the data set locally before filling it"
    What is meant by the part where it says "if data cannot be fully streamed, such as when sorting the data set locally before filling it"?
    Is it saying that the limits don't apply when sorting? Is it applicable to 32-bit only?
    Thanks

    Power Query will try to tell the server to do the transformations on its side if it can, which means that Power Query does not need to store all of the data locally. In certain cases this is not possible, so Power Query has to download the data
    locally before doing the transformation, which takes up memory.
    In this case, if we have to do the sort locally (because the user has buffered the result or because the data source we are using cannot be streamed), then the 32-bit version of Power Query can use up to around 1 GB of memory, while the 64-bit version of
    Power Query can use as much virtual memory as is available on the system.
    I hope that helps.
    Thanks.
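    The streaming-versus-buffering distinction is easy to demonstrate in any language: a row-at-a-time transform needs constant memory, while a sort must materialize the whole input before emitting its first row. A small illustrative Python sketch (generators stand in for a streamed source; this is an analogy, not Power Query internals):

```python
def rows():
    """A streamed source: yields one row at a time, nothing kept in memory."""
    for i in range(1_000_000):
        yield i

# Streamable transform: examine rows one at a time, constant memory.
first_even = next(r for r in rows() if r % 2 == 0)

# Non-streamable transform: sorted() must pull EVERY row into a local list
# before it can emit the first result -- this is the "download locally"
# case where the 32-bit engine is capped near 1 GB.
smallest = sorted(rows(), reverse=True)[-1]

print(first_even, smallest)
```

    The filter never holds more than one row, while the sort briefly holds all one million; that difference is what the specification's memory limits are about.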
