FI Batch Data Processing

Hi All,
A few days back I had a problem with the system: the runtime error COMPUTE_BCD_OVERFLOW occurred.
I have now solved the issue, but I would like to check:
how can I spot the missing batches early, and is there any way to retrieve and reprocess them?
Please advise how I should resolve this.
Thanks and regards,
Bala

Dear Bala,
You can run the same file again. You can check the session information in transaction SM35.
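If you want to catch unprocessed sessions early rather than checking the SM35 screen manually, you can also read the batch input queue header table directly. A minimal sketch, assuming the standard queue table APQI with its GROUPID, QSTATE and CREDATE fields (verify the field names in SE11 on your release; the session name is a placeholder):

    " List batch input sessions and their status straight from APQI,
    " the queue header table behind the SM35 display.
    DATA: lt_sessions TYPE TABLE OF apqi,
          ls_session  TYPE apqi.

    SELECT * FROM apqi INTO TABLE lt_sessions
      WHERE groupid = 'ZFI_BATCH'.        " placeholder session name

    LOOP AT lt_sessions INTO ls_session.
      WRITE: / ls_session-groupid, ls_session-qstate, ls_session-credate.
    ENDLOOP.

And since the dump itself comes from a packed-number overflow, you can guard the critical calculation so that one bad record no longer kills the whole run. A minimal sketch (the field size and the calculation are placeholders):

    DATA lv_amount(8) TYPE p DECIMALS 2 VALUE '999999999999.99'.

    TRY.
        " placeholder calculation that overflows the packed field
        lv_amount = lv_amount * 100.
      CATCH cx_sy_arithmetic_overflow.
        " log the record and continue with the next one
    ENDTRY.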
Regards,
Naveen.

Similar Messages

  • Batch data process into PLSQL by webservice

    Hi,
    how do I perform a batch data process into EBS?
    E.g. I have a large amount of data, around 10,000 lines, and am about to import it into EBS, but I don't know whether this will cause trouble for the web service and the PL/SQL.
    Any ideas?

    Use a PL/SQL table as the IN parameter, so the whole batch is passed to the procedure in a single call.

  • How can I recover from a Botched BATCH DATE CHANGE in iPhoto?

    Botched Batch Date Change iPhoto08
    Short Story: Intending to change the date for one photo, I managed to change the dates for all 8,525 images in my iPhoto08 library to the SAME DATE!! I found that the original creation dates were saved in the Modified field that appears in the Photo Info Window / File metadata subsection. I would like some help figuring out how to set the Date Created to this File Modified date that appears in the Photo Info Window.
    GORY DETAILS:
    I used iPhoto08/ Photos menu / BATCH CHANGE... /“Set DATE to 20100723 1:20:54 PM” with “Modify original files” checked. iPhoto changed all 8,525 images to the same date.
    I now realize I should have used Photos menu / Adjust Date and Time... But the damage was already done. Unfortunately there was no Edit menu / Undo for this operation. ALL 8,525 images have the same date. I checked in the iPhoto Information sidebar--same date on all images.
    Next, as a test, I dragged an image to the desktop and opened it in GraphicConverter 7.0.3. The same date “20100723 1:20:54 PM” appeared in the Image Browser List / “Date Created”, “Date Modified”, “Date Captured” metadata fields.
    The following appeared in the Graphic Converter
    Image Preview sidebar (subWindow) going left to right
    Image menu/button
    Date and Time: Friday, July 23, 2010 1:20:54 PM PT
    Exif menu/button
    File date and time: 2010:07:23 13:20:54
    Date and time of original data generation: 2010:07:23 13:20:54
    Unknown tag (36868): 2010:07:23 13:20:54 <- significant?
    ExifTool menu/button
    ExifTool Version Number:8.40
    Warning: [minor] Suspicious MakerNotes offset for tag 0x9001
    ----System----
    File Modification Date/Time: 2010:07:23 13:20:54-07:00
    ----IFDO----
    Modify Date: 2010:07:23 13:20:54
    ----ExifIFD----
    Date/Time Original: 2010:07:23 13:20:54
    Create Date: 2010:07:23 13:20:54
    <<<<<<OH MY GOD >>>>>> the BATCH CHANGE / SET DATE not only RESET the date for EVERY IMAGE in my library, it also changed the date metadata EVERYWHERE inside each individual image file.
    Actually it's my Mom and Dad's iPhoto library containing all the kids and grand kids pictures, family trips, important events, everything. There is no backup. I feel as if I have tipped over a file cabinet containing all our photos. Our “date and event” organized 8,525 image iPhoto library was now as useful as a giant shoebox. On second thought, a shoe box full of real photo prints would be more useful than my iPhoto library because the prints would have the print date stamped on them.
    At this point feeling desperate I closed my eyes and said a quick prayer to Saint Rita - Patron Saint of the Impossible.
    I returned to iPhoto .................and noticed the Show Photo Info MENU item in the Photo Menu.
    I selected it.
    And lo and behold,
    in the Photo Info WINDOW,
    under the File metadata SUBSECTION, the correct photo creation date appeared --> "Modified: 20071002 05:46:07 PM". I checked different photos ... they all had different FILE MODIFIED dates. These dates were/are the old creation dates! My prayers were answered: THANK YOU SAINT RITA!
    But now I am really stumped! This File Modified date seems to be some kind of external metadata that is indexed or keyed to each image. How is this metadata stored in the iPhoto library? How can I access this File Modified date? I now turn to you GREAT iPhoto WIZARDS. Pray tell, how can I set the Creation Date to the File Modified date that appears in the Photo Info Window, as a (scriptable?) batch process?
    I am on my knees for this one... please help. We have iPhoto11 but I am afraid to upgrade to it. Until I know more I don't want to lose this Modified date metadata.

    Thanks for the quick response...
    No simple solution, because there is no backup to restore from.
    iPhoto seems to be using file metadata as some sort of index key in the Photo Info Window. Open up the iPhoto library package and take a look at the .xml file -- ModDate is used everywhere. I'm hoping to export this data and merge/join it to each image.
    Still trying to figure it out.
    Thanks,
    Tom

  • Splitting events after batch date change causes event to disappear

    Hi all,
    The basic (repeatable) scenario is this:
    o I scan in a bunch of old pictures I want to keep in iPhoto (let's say 10)
    o I import them into iPhoto (no problems with the import); 1 event with 10 pictures is created.
    o I then take a specific set of 4 photos and split them out of the main event into a new event (everything still OK) - 2 events, with 4 and 6 photos respectively.
    o I open the new event and do a batch date change on the 4 pictures, changing the date to the correct date of the picture.
    o I exit the event expecting it to move up in the list of events (I sort by date). It's gone! iPhoto lists only the 1 remaining event and its 6 photos.
    o I change my sort (by time, by name, etc.) and it's still not there.
    o I find my library in Finder, open the package, and the files are still there.
    o I go back to iPhoto and click on "Photos". Walla! 10 pictures!
    o I undo the date change and they show up again in Events - 2 events, 10 total pictures.
    I've already done all the obvious checks (are you sorting by date, look up by event name, search by file name through all events, etc.). I've talked with a product specialist and they gave the standard "rebuild the library" advice, which did not resolve the issue.
    I can repeat this until the cows come home in the existing library. I've created a brand new library and it does not repeat there, so there is something in my library that is not right. But I don't buy the standard "redo your library from scratch" (36K pictures, 50 faces and hundreds of places). Just as with Faces, where you can discard the db files, there must be something besides "rebuild automatically and if that doesn't work, sorry, start over". Was iPhoto coded in Redmond?
    Has anyone seen anything similar to this? I looked through the forums and couldn't see anything. If I have to rebuild this, I am done with iPhoto, after having had to rebuild due to Faces too many times, and my question changes to: what are folks' recommendations for a more stable photo management software suite?
    Thanks!
    HOTG

    If the problem doesn't repeat in another library then obviously the issue is with that library.
    As you've tried the basic rebuild: back up and then try rebuilding the library: hold down the command and option (or alt) keys while launching iPhoto. Use the resulting dialogue to rebuild. Choose to Rebuild iPhoto Library Database from automatic backup.
    The obvious solution is to restore a good version from your backup and, er, Walla! as you say yourself.
    Next least destructive option: download iPhoto Library Manager and use its rebuild function. This will create a new library based on data in the albumdata.xml file. Not everything will be brought over - no slideshows, books or calendars, for instance - but it should get all your albums and keywords back.
    Because this process creates an entirely new library and leaves your old one untouched, it is non-destructive, and if you're not happy with the results you can simply return to your old one.
    Regards
    TD

  • Batch data communication

    How do I decide which method to use when uploading data from a legacy system to SAP?
    In the session method:
    suppose there are 10 records to upload; if an error occurs at the 6th record while uploading data through a session, will records 7, 8, 9 and 10 still be processed? What will happen?
    Also, what is the direct input method?

    Hi Aravind,
    First, to answer your questions directly: with the session method, an error at the 6th record does not stop the session. The failed transaction is recorded in the session log, records 7 to 10 are still processed, and you can correct and reprocess the failed transaction afterwards in SM35. Direct input, by contrast, writes the data straight to the database through special standard programs (for example RFBIBL00 for FI documents) without replaying screens; it is faster for very large volumes, but available only where SAP provides such a program.
    Data Transfer - BDC (Batch Data Communication)
    It is used to transfer data from SAP to SAP or from a non-SAP system to SAP. It uses the normal transaction codes to transfer the data.
    Data Transfer Methods 
    You can use the following methods to transfer data:
    •        CALL TRANSACTION: Data consistency check with the help of screen logic.
    •        Batch input with batch input sessions: Data consistency check with the help of screen logic
    Difference between Batch Input and CALL TRANSACTION
    If direct input cannot be used for your task, these two methods make creating a data transfer program easier, since the underlying transactions ensure that the data consistency checks are executed.
    In the case of an error during the data transfer (if data records are inconsistent, for example), you can restart the transfer at the point in the program where the error occurred.
    Data Transfer Overview
    Batch input methods
    With the batch input method, an ABAP program reads the external data that is to be entered in the SAP system and stores the data in a “batch input session”. The session records the actions that are required to transfer data into the system using normal SAP transactions.
    When the program has generated the session, you can run the session to execute the SAP transactions in it. You can explicitly start and monitor a session with the batch input management function (by choosing System → Services → Batch input), or have the session run in the background processing system.
    CALL TRANSACTION methods
    In the second method, your program uses the ABAP statement CALL TRANSACTION USING to run an SAP transaction. External data does not have to be deposited in a session for later processing. Instead, the entire batch input process takes place inline in your program.
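    As a minimal sketch of the CALL TRANSACTION USING variant (the transaction code, module pool, screen number and field names below are placeholders from a hypothetical recording - record your own with transaction SHDB):
      REPORT z_bdc_call_transaction_sketch.

      DATA: lt_bdcdata TYPE TABLE OF bdcdata,
            ls_bdcdata TYPE bdcdata,
            lt_msgs    TYPE TABLE OF bdcmsgcoll.

      " First screen of the recorded transaction (placeholder values)
      ls_bdcdata-program  = 'SAPMF05A'.
      ls_bdcdata-dynpro   = '0100'.
      ls_bdcdata-dynbegin = 'X'.
      APPEND ls_bdcdata TO lt_bdcdata.

      " One field value on that screen (placeholder)
      CLEAR ls_bdcdata.
      ls_bdcdata-fnam = 'BKPF-BLDAT'.
      ls_bdcdata-fval = '01.01.2010'.
      APPEND ls_bdcdata TO lt_bdcdata.

      " MODE: A = display all, E = display errors only, N = no display
      " UPDATE: S = synchronous, A = asynchronous, L = local
      CALL TRANSACTION 'FB01' USING lt_bdcdata
           MODE 'N' UPDATE 'S'
           MESSAGES INTO lt_msgs.

      IF sy-subrc <> 0.
        " no automatic log here - evaluate lt_msgs yourself
      ENDIF.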
    Choosing Data Transfer Methods
    Selecting a Data Transfer Method  
    When you transfer data in ABAP, you have three options for submitting the data. Only the first two methods can be recommended without reservation. The third method, by way of CALL DIALOG, is outmoded; CALL DIALOG is less convenient than the other methods and you should use it only if you must.
    •     Use the CALL TRANSACTION USING statement
    Summary: With CALL TRANSACTION USING, the system processes the data more quickly than with batch input sessions. Unlike batch input sessions, CALL TRANSACTION USING does not automatically support interactive correction or logging functions.
    Your program prepares the data and then calls the corresponding transaction that is then processed immediately.
    The most important features of CALL TRANSACTION USING are:
    o     Synchronous processing
    o     Transfer of data from an individual transaction each time the statement CALL TRANSACTION USING is called
    o     You can update the database both synchronously and asynchronously
    The program specifies the update type
    o     Separate LUW (logical units of work) for the transaction
    The system executes a database commit immediately before and after the CALL TRANSACTION USING statement
    o     No batch input processing log
    •     Create a session on the batch input queue.
    Summary: Offers management of sessions, support for playing back and correcting sessions that contain errors, and detailed logging.
    Your program prepares the data and stores it in a batch input session. A session is a collection of transaction data for one or more transactions. Batch input sessions are maintained by the system in the batch input queue. You can process batch input sessions in the background processing system.
    Your program must open a session in the queue before transferring data to it, and must close it again afterwards. All of these operations are performed by making function module calls from the ABAP program.
    The most important aspects of the session interface are:
    o     Asynchronous processing
    o     Transfers data for multiple transactions
    o     Synchronous database update
    During processing, no transaction is started until the previous transaction has been written to the database.
    o     A batch input processing log is generated for each session
    o     Sessions cannot be generated in parallel
    The batch input program must not open a session until it has closed the preceding session.
    Executing Data Transfer Programs 
    Procedure
    If you are using an SAP data transfer program, follow the procedure specified in the program documentation.
    If you are using a generated data transfer program, proceed as follows:
    1.     Start the data transfer program.
    2.     Decide which batch input method you want to use for the data transfer.
    a) CALL TRANSACTION USING:
    You must specify the:
    – Processing mode: You use this parameter to specify whether processing should take place in the background or in dialog mode.
    Possible values are:
    A     Display all
    E     Display only errors
    N     No display
    – Update mode: This parameter determines how the data is to be updated:
    Possible values are:
    S     Synchronous
    A     Asynchronous
    L     Local update
    – Error session: Here you have the option to specify a session name for a batch input session in which data is to be written in the case of an error. You can use this to identify incorrect data records after the batch input program has run and to import the records into the R/3 System once you have corrected them.
    If you are creating an error session, you must also specify:
    – User: Specify the user with whose authorizations the sessions are processed.
    – Keep session: This specifies whether or not the session should be deleted once it has been processed.
    – Lock date: Specify the processing date for the error session.
    b) Generate session:
    – Session name: Specify a name for the batch input session to be generated.
    – User: Specify the user with whose authorizations the sessions are processed.
    – Keep session: This specifies whether or not the session should be deleted once it has been processed.
    – Lock date: Specify the processing date for the error session.
    3.     Specify a character that is to be used as the NODATA character.
    4.     Specify the path of the data file from which the data is to be imported into the R/3 System.
    5.     Execute the program.
    6.     If you have generated a session, or if errors occurred in CALL TRANSACTION USING mode, you must now edit the generated sessions. You can find information on this in BC - System Services, under batch input sessions.
    Creating a Session with BDC_OPEN_GROUP  
    Use the BDC_OPEN_GROUP function module to create a new session. Once you have created a session, you can insert batch input data into it with BDC_INSERT.
    You cannot re-open a session that already exists and has been closed. If you call BDC_OPEN_GROUP with the name of an existing session, an additional session with the same name is created.
    A batch input program may have only one session open at a time. Before opening a session, make sure that the program has closed any session that it previously had open.
    BDC_OPEN_GROUP takes the following EXPORTING parameters:
    •        CLIENT
    Client in which the session is to be processed.
    Default: If you don't provide a value for this parameter, the default is the client under which the batch input program runs when the session is created.
    •        GROUP
    Name of the session that is to be created. May be up to 12 characters long.
    Default: None. You must specify a session name.
    •        HOLDDATE
    Lock date. The session is locked and may not be processed until after the date that you specify.  Only a system administrator with the LOCK authorization for the authorization object Batch Input Authorizations can unlock and run a session before this date.
    Format: YYYYMMDD (8 digits).
    Default: No lock date, session can be processed immediately. A lock date is optional.
    •        KEEP
    Retain session after successful processing. Set this option to the value X to have a session kept after it has been successfully processed. A session that is kept remains in the input/output queue until an administrator deletes it.
    Sessions that contain errors in transactions are kept even if KEEP is not set.
    Default: If not set, then sessions that are successfully processed are deleted. Only the batch input log is kept.
    •        USER
    Authorizations user for background processing. This is the user name that is used for checking authorizations if a session is started in background processing. The user must be authorized for all of the transactions and functions that are to be executed in a session. Otherwise, transactions will be terminated with “no authorization” errors.
    The user can be of type dialog or background. Dialog users are normal interactive users in the SAP system. Background users are user master records that are specially defined for providing authorizations for background processing jobs.
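    Putting these parameters together, here is a minimal sketch of the session method (the session name, transaction code and dates are placeholders; lt_bdcdata would be filled from a recording exactly as in the CALL TRANSACTION sketch above):
      DATA lt_bdcdata TYPE TABLE OF bdcdata.

      " Open the session (GROUP, USER, KEEP, HOLDDATE as described above)
      CALL FUNCTION 'BDC_OPEN_GROUP'
        EXPORTING
          client   = sy-mandt
          group    = 'ZFI_UPLOAD'    " session name, max 12 characters
          user     = sy-uname        " authorizations user for background run
          keep     = 'X'             " keep the session after processing
          holddate = '20100801'.     " optional lock date, format YYYYMMDD

      " Fill lt_bdcdata, then queue one transaction per BDC_INSERT call
      CALL FUNCTION 'BDC_INSERT'
        EXPORTING
          tcode     = 'FB01'
        TABLES
          dynprotab = lt_bdcdata.

      " Close the session; it then appears in SM35 for processing
      CALL FUNCTION 'BDC_CLOSE_GROUP'.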
    Types of BDC and their relative advantages:
    CLASSICAL BATCH INPUT (session method)
    CALL TRANSACTION
    Session method:
    1) asynchronous processing (the session is processed separately, e.g. in SM35)
    2) can transfer large amounts of data
    3) processing is slower
    4) an error log is created automatically
    5) data is not updated until the session is processed
    Call transaction:
    1) synchronous processing
    2) suited to smaller amounts of data
    3) processing is faster
    4) errors need to be handled explicitly
    5) data can be updated immediately during the call
    Regards,
    Azhar

  • How to restore system RAM memory when i run DI batch job process

    How do I free up system RAM when I run a DI batch job process?
    I use SAP B1 2005 SP53 and develop with the Business Partner object.

    Hi all,
    Take care: setting a variable to Nothing released the object in VB6, but in .NET it only resets the reference to its default value!
    From the Visual Studio help:
    When you assign Nothing to an object variable, it no longer refers to any object instance. If the variable had previously referred to an instance, setting it to Nothing does not terminate the instance itself. The instance is terminated, and the memory and system resources associated with it are released, only after the garbage collector (GC) detects that there are no active references remaining.
    The garbage collector destroys the objects at the end of the sub or function, not inside the loop.
    So use local variables when you are loading the 7,000 items. This works fine for me and I do not have memory problems. I am loading 50,000 items with prices from an EDI file, and memory usage stops at 120 MB.
    1st solution: create a sub or a function to populate the data, and call it for every item as a single routine:
      Sub X()
          Dim obj As New Object()
          Try
              ' ... put your update code here
          Catch ex As Exception
              ' handle or log the error
          Finally
              GC.Collect()
          End Try
      End Sub
    This frees up the memory when the garbage collector runs.
    2nd solution: declare the variable inside the loop, so each iteration's instance becomes eligible for collection:
      Sub X()
          Try
              For i As Integer = 1 To 7000
                  Dim obj As New Object()
                  ' ... process one item here
              Next
          Catch ex As Exception
              ' handle or log the error
          Finally
              GC.Collect()
          End Try
      End Sub
    This frees up the memory used inside the loop.
    I recommend using the Friend keyword for your add-on's classes, and not using modules.
    Modules and shared classes only free up their memory when your program terminates.
    For more information see MSDN help http://msdn.microsoft.com/en-us/library/7ee5a7s1(VS.80).aspx
    and http://msdn.microsoft.com/en-us/library/0x9tb07z(VS.80).aspx
    Regards,
    J

  • Automating F110 through a batch job process

    Is there a possibility of automating F110 through a batch job process? If yes, could somebody explain how that can be done?

    Hi Shilpa,
    To run F110 you first have to enter a run date and identification, then:
    1) Select the PARAMETER tab. Enter the posting date, document date, company code, payment method and next payment date, and specify the customer/vendor numbers for which you want to make the payment.
    2) In the Additional Log section, select the checkboxes for due date check, payment method selection if not successful, and line items of the payment documents, and enter the vendor/customer numbers. SAVE this.
    3) Schedule the PROPOSAL RUN (start immediately).
    4) Schedule the PAYMENT RUN (start immediately).
    5) If you want to print the documents, maintain a variant and specify the variant in front of the programme.
    HOPE THIS WILL HELP YOU

  • Batch Data Capture of transaction 'MIRO' via background execution

    Hi All,
    I am currently working on a report that amongst other things conducts a batch data capture (i.e. BDC) via a recording of transaction code 'MIRO' in order to create an Invoice Verification with the 'Invoice Receipt' number being the only expected output after successful execution.
    I am populating only the following fields (i.e. via the recording) with data processed within the report program:
    -  INVFO-BLDAT
    - RM08M-EBELN
    - INVFO-XBLNR
    - INVFO-WRBTR
    The problem however is that this same recording works when I execute the report in the foreground, but does not work when I run it in a background work process. Could there be a common and well-known cause for this?
    Please help!  I just need the same recording to be able to run in the background work process as well as it runs in the foreground.
    Thank you.

    Hi!
    For technical reasons, batch input and the new single-screen Enjoy transactions are not compatible. See Notes 381630 & 381593 for more detail:
    381630 - Batch input for Enjoy transactions in MM not possible
    381593 - MIRO/MIR7/MIRA: Batch input is not possible
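    Since batch input is blocked for these Enjoy transactions, the alternative usually suggested (an addition of mine, not taken from the notes) is to post the invoice through the standard BAPI instead of a MIRO recording. A minimal sketch mapping the fields from your recording, with all document values as placeholders:
      DATA: ls_header TYPE bapi_incinv_create_header,
            ls_item   TYPE bapi_incinv_create_item,
            lt_items  TYPE TABLE OF bapi_incinv_create_item,
            lt_return TYPE TABLE OF bapiret2,
            lv_belnr  TYPE belnr_d,
            lv_gjahr  TYPE gjahr.

      ls_header-invoice_ind  = 'X'.              " invoice, not credit memo
      ls_header-doc_date     = '20100101'.       " INVFO-BLDAT
      ls_header-pstng_date   = sy-datum.
      ls_header-ref_doc_no   = 'EXT-REF-001'.    " INVFO-XBLNR
      ls_header-comp_code    = '1000'.
      ls_header-currency     = 'EUR'.
      ls_header-gross_amount = '1190.00'.        " INVFO-WRBTR

      ls_item-invoice_doc_item = '000001'.
      ls_item-po_number        = '4500000001'.   " RM08M-EBELN
      ls_item-po_item          = '00010'.
      ls_item-item_amount      = '1000.00'.
      ls_item-quantity         = '10'.
      ls_item-po_unit          = 'ST'.
      APPEND ls_item TO lt_items.

      CALL FUNCTION 'BAPI_INCOMINGINVOICE_CREATE'
        EXPORTING
          headerdata       = ls_header
        IMPORTING
          invoicedocnumber = lv_belnr            " the Invoice Receipt number
          fiscalyear       = lv_gjahr
        TABLES
          itemdata         = lt_items
          return           = lt_return.

      " Commit only if RETURN contains no error messages
      READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
      IF sy-subrc <> 0.
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
      ENDIF.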
    Hope the information helps you.                                                       
    Erika

  • Problem while  data processing TRANSACTION data from DSO to CUBE

    Hi Gurus,
    we are facing a problem while processing transaction data from a DSO to a cube: the data packets are processing and updating very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I will suggest you to check a few places where you can see the status
    1) SM37 job log (give the BI request name): it should give you the details about the request. If it's active, make sure that the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not. See if it's accessing/updating some tables or not doing anything at all.
    If it's running and you are able to see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the details tab. It may be stuck in the update rules.
    4) ST22: check whether any short dump has occurred.
    You can also try SM50 / SM51 to see what is happening in the system level like reading/inserting tables etc.
    If you feel its active and running you can verify by checking if the number of records has increased in the cube.
    Thanks,
    JituK

  • HKONG: Definition of the processes that handle Material Transaction Interface data

    PURPOSE
    To define the processes that handle Material Transaction Interface data.
    Explanation
    The related information is as follows.
    Two processes are run to handle Material Transaction Interface data:
    - INCTCM (Process Transactions Interface)
    - INCTCW (Inventory Transactions Worker)
    (1)
    Records are processed into the temp tables by INCTCM - Process Transactions Interface from the interface tables:
         MTL_TRANSACTIONS_INTERFACE     to MTL_MATERIAL_TRANSACTIONS_TEMP
         MTL_TRANSACTION_LOTS_INTERFACE     to MTL_TRANSACTION_LOTS_TEMP (when lots are used)
         MTL_SERIAL_NUMBERS_INTERFACE     to MTL_SERIAL_NUMBERS_TEMP (when serials are used)
    ==> After validation by INCTCM, the data in the interface tables is moved to the temp tables.
    (2)
    After the records are processed from the MTL_TRANSACTIONS_INTERFACE into the MTL_MATERIAL_TRANSACTIONS_TEMP
    by the INCTCM - Process Transactions Interface,
    a worker will be launched to process the record from MTL_MATERIAL_TRANSACTIONS_TEMP into MTL_MATERIAL_TRANSACTIONS.
    The worker is called INCTCW - Inventory Transaction Worker.
    The INCTCM - Process Transactions Interface will launch a single INCTCW - Inventory Transaction Worker for all rows
    that meet the criteria in MTL_MATERIAL_TRANSACTIONS_TEMP :
         TRANSACTION_MODE     = 3
         LOCK_FLAG          = N
         PROCESS_FLAG          = Y
    Once the process is complete the records will be moved into the corresponding
    transaction table :
         MTL_MATERIAL_TRANSACTIONS_TEMP     to MTL_MATERIAL_TRANSACTIONS
         MTL_TRANSACTION_LOTS_TEMP     to MTL_TRANSACTION_LOT_NUMBERS
         MTL_SERIAL_NUMBERS_TEMP          to MTL_UNIT_TRANSACTIONS
    ==> INCTCM calls INCTCW, and this process inserts the data from the TEMP tables into the MMT table and the inventory tables.
    The rows in mtl_transactions_interface are processed in 5 phases.
    1. Derives dependent columns, e.g. acct_period_id, primary_quantity etc.
    2. Detailed validation is performed on the records
    3. On-hand quantity checks, e.g. for negative quantities
    4. Reservations Relieved if demand was created in order entry
    5. Rows are moved to mtl_material_transactions_temp where the
    transaction processor is called to process these rows and update the inventory levels etc..

  • Custom Data Processing Extension and Parameters

    Hello,
    I have successfully implemented a custom data processing extension. Now I'm trying to add parameters, but the documentation is very weak on this topic. Also, all the examples of custom data processing extensions I have found so far work without parameter support. Using a search engine I found a custom data processing extension's sequence diagram during report execution. The related article also gives some hints about the IDbCommandAnalysis interface.
    The GetParameters implementation creates a parameter collection based on the current command text and returns this collection. So far no problem.
    My question is how to implement the IDbCommand interface's CreateParameter method and Parameters property. In CreateParameter I just return a new instance of my IDataParameter implementation, and the Parameters property returns a new parameter collection based on the current command text.
    Is this OK? At least this is the first implementation which seems to work in the data designer. If I share an instance variable between GetParameters and Parameters, I either get all parameters twice or BIDS crashes when pressing the execute button in the dataset designer - depending on my actual implementation.
    What I'm still missing is how to get the parameters added automatically to the parameters register of the dataset's properties.
    thanx in advance and kind regards, Gerald

    I think that finally (after some days of trying) I have made it work. The dataset and report parameters now reflect the command text's parameters (BTW, to answer my last question: the dataset's properties are updated only after pressing the refresh button). The road to success was not to use the same member variable for the collections returned by IDbCommandAnalysis.GetParameters() and IDbCommand.Parameters, and not to fill the latter with the command text's parameters (see the partial source code below).
    public sealed class MyCommand : IDbCommand, IDbCommandAnalysis {
        private string m_commandText;
        private MyParameterCollection m_parameters = new MyParameterCollection();

        public string CommandText {
            get { return this.m_commandText; }
            set { SetCommandText(value); }
        }

        // IDbCommand.Parameters: a separate collection that is NOT
        // pre-filled from the command text
        public IDataParameterCollection Parameters {
            get { return m_parameters; }
        }

        public IDataParameter CreateParameter() {
            return new MyParameter();
        }

        // IDbCommandAnalysis.GetParameters: builds a fresh collection
        public IDataParameterCollection GetParameters() {
            return GetParameterCollection();
        }

        private void SetCommandText(string commandText) {
            m_commandText = commandText;
        }

        private MyParameterCollection GetParameterCollection() {
            // ... create a parameter collection based on m_commandText
            return new MyParameterCollection();
        }
    }
    Still there are some open questions:
    How can I update the report parameters' properties? At the moment the parameters' data type is always string, but I'd like to assign the right data type. I'd also like to tag the parameters as hidden and supply a non-queried default value. Is this possible? How?
    Although my implementation of IDbCommandAnalysis.GetParameters() returns a collection containing the parameter names and values, the "prompt query parameters" dialog (after pressing the query designer's execute button in the toolbar) shows a "<blank>" value for the parameters. What could be the problem?
    Thanx in advance and regards,
    Gerald

  • Text Data Processing 4.1 Training in February & April for APJ & EMEA

    We would like to announce upcoming training opportunities in Malaysia and Germany for partners and select customers on Text Data Processing within the upcoming Data Services 4.1. Get early hands-on access to software and participate in the review of the product roadmap.
    Why Partner Test?
    - Receive an introduction to the product and how text analytics capabilities are being delivered through Data Services
    - Only source for partners and select customers to get deep product knowledge directly from development at this early stage
    - Better understand migration scenarios for BusinessObjects Text Analysis XI customers
    Test Scope
    - Entity Extraction from unstructured data in one or more of the 31 supported languages
    - Data Quality transforms for post-processing such as Match and Address Cleanse
    - Sentiment analysis in either English, German, French or Spanish
    Time Frame & Location
    - Kuala Lumpur, Malaysia: February 13-16  >>> FULL <<<
    - Walldorf, Germany: April 16-17 and 18-20  >>> FULL <<<
    Tester Profile
    - Data Services knowledge helpful
    - Previous experience with text analytics or text mining
    - Existing use cases for text analysis and sample data to test
    Tester Demand
    - 2-4 days depending on session
    - No registration fee but attendees must provide their own travel and lodging
    For further Text Data Processing info: http://wiki.sdn.sap.com/wiki/display/BOBJ/TextDataProcessing
    If you are interested in attending or have questions please contact Anthony Waite @ SAP.

    Did you try this document:
    How to find the Missing Records for the Target Table during Post Load validation at Business Objects Data Services - Bus…

  • New tutorials posted for Text Data Processing on Data Services 4.0

    Check out the 3 new Text Data Processing tutorials available at http://wiki.sdn.sap.com/wiki/display/BOBJ/TextDataProcessing within the Product Tutorials section.
    -- Introduction to Entity Extraction Transform: this 15-minute demo provides an overview of the Text Data Processing Entity Extraction transform for Data Services 4.0. The Entity Extraction transform enables you to process unstructured text, such as web pages, news articles, maintenance logs, text from a spreadsheet, or even from a database column, to extract key pieces of information that you can use in query and reporting.
    -- Using Text Data Processing Blueprints: this 6-minute demo explains how to get up and running with TDP quickly using a series of jobs contained in a blueprint. A blueprint is a sample end-to-end solution available on SAP Community Network.
    -- Creating an Extraction Dictionary: this 13-minute demo explains how to create an Extraction Dictionary, which can be used to customize a TDP Entity Extraction transform to improve results.

    Thanks for sharing your use case.  I have a few questions if you don't mind:
    1) What file formats are typically found within these compressed containers?  For example, html, xml, txt, pdf, MS Office, etc.
    2) Do these compressed containers ever have other embedded compressed containers within them?  For example, a zip file containing other zip files within it.
    3) If the intention is to process document files, do any of the document files have other document formats embedded/nested within them?  For example, a MS Word document with a spreadsheet embedded within it.

  • Custom Data Processing Extension, use in SSRS Report Properties - References

    I've built a Custom Data Processing Extension (CDPE) and registered it successfully (i.e. it shows up in the new data source dialog/drop-down and saves just fine, for VS2010-2014). It is intended to be a custom (XML-based) data source. However, based on the "nature of the beast", I also need to have a Custom Query Designer (CQD) for development/testing of the CDPE.
    Here are the errors I get for the CQD:
    Pulling a report up in "Report Preview", which is wired to the CDPE->CQD, I get:
    "An error occurred during local report processing. The definition of the report '/TestDS' is invalid. Error while loading code module: 'Microsoft.ReportingServices.Interfaces, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'. Details:
    Could not load file or assembly 'Microsoft.ReportingServices.Interfaces, Version=11.0.0.0, Culture=neutral, PublicKeyToken 89845dcd8080cc91' or one of it's dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception
    from HREULT: 0x80131040)"
    My CDPE directly includes Microsoft.ReportingServices.Interfaces.dll AND matches everything as far as version and key. It also includes Microsoft.ReportingServices.QueryDesigners.dll required for the CQD.
    I've written other WORKING CDPEs, but not one with a CQD (a custom Query Designer replacement in Visual Studio). All the references, from what I can tell, are OK. I think CQDs are screwed up for XML data sources. The interfaces are not right (I will explain further on).
    From the "Data Sources", Dataset Properties, I click on the "Query Designer", I get:
    "An error occurred while loading the query designer 'DATASET' (which is the name of the CDPE). Query Designer: Object reference not set to an instance of an object."
    I "think" XML type CDPEs are trying to execute a web services call, versus working properly/CORRECTLY with a
    text-based query for XML. The reason I say this is that I've created both WinForm and WebForm test harnesses. They both come up with this error: "...Failed to prepare web request for the specified URL. (rsXmlDataProviderError), Invalid
    URI: The URI is empty." (which is nonsense, there is no request, the query is simply text/file-based stuff, and I read locally ALL of the XML data expected for testing without issue -> I'm ONLY making the CDPE XML-based because I have custom
    WCF calls which already work). (If you really want to understand overall architecture, please see my post: http://social.msdn.microsoft.com/Forums/en-US/d15d9206-95d7-473a-a7f9-a38b4279de8c/ssrs-extension-which-to-use?forum=sqlreportingservices&prof=required
    Other than "100 mile" overviews from Microsoft, this has got to be some of the worst documented stuff I've ever seen (
    http://msdn.microsoft.com/en-us/library/microsoft.reportingservices.interfaces.iquerydesigner.aspx ). Remote Debugging it doesn't work 95% of the time.
    My environment is VS2013 Ultimate with BI and SQL Server 2012 SP1.
    Thanks Rob
    Rob K

    Update:
    I can now see the Custom Query Designer and get anticipated results (after some fooling around with different combinations).
    Here's how things were broken by the MS SQL Server 2012 product/release team:
    1. they upgraded to .Net v4.x here (to support SharePoint, AX, MS Data Tools, etc.)
    C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.ReportingServices.QueryDesigners.dll
    C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.ReportingServices.Interfaces.dll
    2. they left c:\Program Files\Microsoft SQL Server\MSRS.11.MSSQLSERVER\Reporting Services\ReportServer\bin\Microsoft.ReportingServices.Interfaces.dll at .Net v2.x
    3. they don't support Custom Extensions (which use a Query Designer) with anything higher than .Net v3.5
    In my case, I had to segregate:
    a. Report Definition Custom Extension to v4.5
    b. Custom Data Processing Extension to v3.5
    c. Custom Query Designer to v4.x
    d. my WCF/SSO to v4.5.1.
    #2 and #3 above are, in my humble opinion, simply dead wrong as far as what you ever want to do in a release cycle (I can see there being an early/first-release exception, but two years out, with a successor product (2014), this should have been rectified more than a year ago.)
    Whoever failed to get this communicated in the 2012 documentation created even more havoc than an average developer can decipher:
    http://msdn.microsoft.com/en-us/library/microsoft.reportingservices.interfaces.iquerydesigner(v=sql.110).aspx
    (I'm still working on how to get the remote debugger working consistently.)
    Rob

  • Calling a Stored Procedure using SSRS Custom Data Processing Extension

    I need an SSRS Custom Data Processing Extension to call a stored procedure for my SSRS report. I have referred to many links regarding this but cannot find one; instead there are examples of data processing extensions that use XML files and multiple data sources.
    I want the Data Processing Extension to call a stored procedure.
    Please help. Thanks in advance.

    Sorry, why do you need a Data Processing Extension for that? Can't you directly call the procedure from an SSRS dataset? What's the RDBMS which holds this procedure?
    Visakh
