What is the proper process to enable LDAP?

Can someone point me to the proper place?
Thanks,

Yes, I'm trying to use a third-party LDAP server for security in my web application (form-based).
I made the changes according to Chapter 9 - External LDAP Security Providers, and I receive this error message after restarting the OC4J server:
Starting OC4J from D:\oracle\jdeveloper\j2ee\home ...
06/09/13 07:58:21 WARNING: Application.setConfig Application: MyApp is in failed state as initialization failedjava.lang.InstantiationException
Sep 13, 2006 7:58:21 AM com.evermind.server.Application setConfig
WARNING: Application: MyApp is in failed state as initialization failedjava.lang.InstantiationException
2006-09-13 07:58:21.910 WARNING J2EE 0JR0013 Exception initializing deployed application: MyApp. null
2006-09-13 07:58:25.496 NOTIFICATION JMS Router is initiating ...
2006-09-13 07:58:27.238 ERROR J2EE HTTP0004 Internal error raised tyring to instantiate web-application: MyWeb defined in web site OC4J 10g (10.1.3) Default Web Site. Application: MyWeb has been stopped
06/09/13 07:58:27 Oracle Containers for J2EE 10g (10.1.3.0.0) initialized
This web app works under the jazn.com realm (XML file-based security).
Thanks,
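Before digging into the OC4J configuration, one quick sanity check is to verify that the LDAP URL and bind credentials you entered for the external LDAP provider actually work. Below is a minimal, hypothetical sketch using plain JNDI; the host, port, bind DN and password are placeholders, so substitute the values from your provider configuration.

import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;

public class LdapBindCheck {
    public static void main(String[] args) {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        // Placeholder URL and credentials: use the same values you configured
        // for the external LDAP security provider in OC4J.
        env.put(Context.PROVIDER_URL, "ldap://ldap.example.com:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, "cn=orcladmin,dc=example,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "secret");
        try {
            DirContext ctx = new InitialDirContext(env);
            System.out.println("Bind succeeded.");
            ctx.close();
        } catch (NamingException e) {
            System.out.println("Bind failed: " + e);
        }
    }
}

If the bind fails here, fixing the directory details first will save time; otherwise it is worth re-checking the provider settings described in Chapter 9, since the InstantiationException points at the application's security configuration rather than the directory itself.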

Similar Messages

  • What would the best process chain look like for this MD data load scenario?

    Hi there,
    I have the following setup. SAP BW connection to external system via DBConnect (DB2 database).
    We have 73 master data text data sources to load either once a week or everyday through process chain. We have not decided on the exact schedule yet.
    All the master data text DataSources pull data from the SAME VIEW created on the DB2 external system:
    VWDEDMASTERDATA
    The structure of the view is the following:
    DEDNAME
    DEDNAMETYPE
    DEDNAMECODE
    DEDNAMEDESC
    DEDNAMELONGDESC
    So every time master data text is extracted for all 73 objects, the same view, VWDEDMASTERDATA, gets queried over and over again. We only differentiate in the DataSource, on the FIELDS tab, which InfoObject DEDNAME should map to, and then in the InfoPackage we filter on the exact object/field to query.
    So essentially, every time we run InfoPackage for master data text object the external system gets queried in the following way:
    SELECT * from VWDEDMASTERDATA where DEDNAME = [FIELD/InfoObject specified in InfoPackage]
    So, if I have to have all 73 objects loaded, essentially the same SQL statement has to execute 73 times. In this scenario, what would be the proper process chain setup with performance in mind? How should the InfoPackages be arranged: in parallel, sequentially, how many in a row, etc.?
    Let me know if you need more information.
    thanks

    They would like to send the letters to me
    Depending upon how they send the letters to you and how they expect Acrobat to convert them, you could be bordering on a license violation that prohibits you from using your version of Acrobat as a server-based product.
    Adobe offers server-based products to convert RTF files into PDF files. Some require your company to run a server, but one seems to be exactly what they may want. The CreatePDF service is run by Adobe and seems capable of doing what your company wants.
    https://www.acrobat.com/createpdf/en/home.html

  • What is the proper role of OLAP meta-data?

    One basic feature continues to confuse me with regard to the Oracle OLAP option. What is the proper role and/or purpose of creating OLAP meta-data? Since you can now use the Analytic Workspace Manager to create multi-dimensional OLAP objects, doesn't that create all the necessary meta-data in the Analytic Workspace itself?
    We have created a fully functioning prototype system with both multi-dimensional cubes as well as relational tables, using both Oracle Discoverer for OLAP and Oracle BI Discoverer to view that data. And we have done so without EVER taking the explicit steps to create OLAP meta-data, i.e. using the CWM or CWM2 packages. Am I missing something? Is it still necessary to use these packages, and if so, what are they for? Are there any Oracle OLAP experts that would care to comment?

    Most implementations of the OLAP option are analytic workspaces (MOLAP). The usual motivators towards MOLAP are the performance of builds/aggregations and queries, and the additional calculation capabilities offered by the multidimensional engine. Also, the administrative infrastructure is better with MOLAP (because that's what most users use).
    Sometimes customers want to do a ROLAP implementation because they're more familiar with tables and feel comfortable about managing security, high availability and disaster recovery. Also, some customers want to access the data with SQL-based tools. But remember, with Oracle OLAP the data is all in the database (safe, secure, same management practices, RAC/Grid enabled, etc.) and you can query it with SQL. Those are some of the things that make Oracle OLAP so terrific as compared to stand-alone OLAP servers.
    That doesn't necessarily mean that MOLAP is correct for every application, but MOLAP is probably used more than 90% of the time.
    If you would like a white paper discussing when to use MOLAP, feel free to send me an email at <[email protected]>.

  • What is the proper way to nullify a Vector after its scope is over

    I am using Vectors and ArrayLists in many places in my web application; it is necessary to use them.
    In some processes I am storing a bulk amount of data in a Vector, and because of that the performance of my application decreases, so I have to nullify the Vector after its scope is over.
    To nullify it I am using:
    Vector v = new Vector();
    v.clear();
    The above method is suitable in the case of simple object data like strings and other values.
    But I want to know: if I am using a HashMap, storing bulk data in it, and then storing each HashMap into the Vector, what is the proper way?
    Do I have to iterate over each HashMap object in the Vector and set them to null and then set the Vector to null, or can I directly use the v.clear() method?
    If anyone has an answer regarding my question, please reply; every valuable reply will be appreciated.
    Thanks in advance!

    JBOSS2000 wrote:
    Each time in the loop a new Vector object is created and each time I am nullifying it. That's what I am doing, and that's why I am nullifying it. Even if I declare it outside the loop, then for each iteration I still have to nullify it, because what I am doing is inserting the data into the database in each iteration of the loop, so I think it is a must to nullify the objects each time.
    If it is constructed inside the loop then you do not have to nullify it. If it is constructed outside of the loop and you want to empty it for each iteration then just clear() it.
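    To illustrate the point above, here is a minimal sketch of the second case (class and field names are made up for illustration): clearing the Vector drops the references to the contained HashMaps, and the garbage collector reclaims them once nothing else refers to them, so there is no need to null out each entry.

    import java.util.HashMap;
    import java.util.Vector;

    public class ClearExample {
        public static void main(String[] args) {
            Vector<HashMap<String, String>> rows = new Vector<HashMap<String, String>>();
            for (int batch = 0; batch < 3; batch++) {
                for (int i = 0; i < 1000; i++) {
                    HashMap<String, String> row = new HashMap<String, String>();
                    row.put("id", String.valueOf(i));
                    rows.add(row);
                }
                // ... insert the batch into the database here ...
                // Dropping the references is enough: the HashMaps become
                // unreachable and are garbage-collected. There is no need to
                // set each element (or the Vector itself) to null.
                rows.clear();
            }
        }
    }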

  • What is the proper way to publish standard proxies/services using PI?

    Hello experts,
    I'm working on a project where we have an ECC 6.0 system and we need to publish some standard services to external third-party applications. We found that the functionality these third-party applications need is provided by the service MaintenanceRequestCreateRequestConfirmation_In of component ESA ECC-SE 603. As we are centralizing all our interfaces in SAP Process Integration, I have installed the ESA ECC-SE 603 XI content in SAP PI. After the installation, in transaction SPROXY on the ECC system I can see the proxy properly implemented and ready to be called. Now I have to configure the interface in the Integration Directory. What I wonder is: what's the best practice to configure this in PI? As I can see in the ESR, the service category is Inbound but, if I'm not wrong, I have to publish an Outbound service? What should I do? Copy the entire definition to my own Software Component and create my own Outbound service? What is the proper way to publish standard proxies/services using PI?
    Thank you in advance.

    Using the Page Items to Submit property on the report region, the Refresh action is all that should be needed. I created a copy of page 2 on page 3, removed the Set Value action, and it is working as expected. I don't have access to 4.2.1 to try it on that specific patch level, but I've been using Page Items to Submit with a Refresh DA since 4.2.0 with no problems.

  • What is the proper way to do Color Space Defaults?

    What is the proper way to do Color Space Defaults? Let the color checker chart fill the whole frame, or just put the color checker chart in front of or beside the items or people we are going to shoot?

    I"m not sure how to use this through Direct Link ... as the timeline including all color-space things are locked down. Perhaps going through Sg in a "native" launch with some of the footage, though it would have to be in a codec Sg takes in "native" ... then doing the color chart thing, creating a Look from it, and then applying that in Direct Link sessions might be the process ... but yea, Dennis Weinmann, some advice here?
    Thanks!
    Neil

  • How can I use 2 Apple IDs in iTunes? I have 2 iOS devices. They each have their own Apple ID. What is the proper way to sync both of them to iTunes?

    How can I use 2 Apple IDs in iTunes? I have 2 iOS devices. They each have their own Apple ID. What is the proper way to sync both of them to iTunes? I wanted my teenager's Apple ID to be different from mine so that she couldn't charge stuff to my Apple ID, so I created another one for myself. Now when I go to sync either device, it tells me that this iOS device can only be synced with one Apple ID. Then I get a message to erase it; not going to do that, lol. If I log out as one ID and log in as the other, will it still retain all synced information on the PC from the first iOS device? If I can just log in and out of the Apple IDs, then I have no problem doing that, as long as the synced apps, music, etc. stay there for both. I am not trying to copy from one to the other, just want to make sure I have a backup for the uh-oh times. If logging in and out of multiple Apple IDs on the same PC is acceptable, then I need to be able to authorize the PC for both devices. Thanks for the help. I am new to the iOS world.

    "Method Three
    Create a separate iTunes library for each device. Note:It is important that you make a new iTunes Library file. Do not justmake a copy of your existing iTunes Library file. If iTunes is open,quit it.
    This one may work. I searched and searched on the website for something like this, I guess I just didn't search correctly, lol. I will give this a try later. My daughter is not be back for a few weekends, therefore I will have to try the Method 3 when she comes back for the weekend again. "
    I forgot to mention that she has a PC at her house that she also syncs to. Would this cause a problem. I am already getting that pop up saying that the iPod is synced to another library (even though she is signed in with her Apple ID to iTunes) and gives the pop up to Cancel, Erase & Sync, or Transfer Purchases. My question arose because she clicked on "Erase & Sync" by mistake when she plugged the iPod to her PC the first time. When the iPod was purchased and setup, it was synced to my PC first. When she went home, she hooked it up to her PC and then she erased it by accident. I was able to restore all the missing stuff yesterday using my PC. However, even after doing that it still told me the next time I hooked it up last night that the iPod was currently synced with a different library. Hopefully, you can help me understand all this. She wants to sync her iPod and also backup her iPod at both places. Both PCs have been authorised. Thanks

  • What is the proper way to record line numbers in Master/Detail records?

    Guys and Gals,
    Been thinking about this for awhile, but thought it best to ask the people who really know what they are doing.
    What is the proper way to record & show line numbers in a Master / Detail record set?
    For example, take Master/Detail relationship Orders and OrderItems. Orders has a column Document_Number and OrderItems has Document_Number, Line_Number. Line_Number should contain the row number 1,2,3,4 ... etc. for each row in a document.
    Should I ...
    1. Add a sequence and a trigger in the database? The FusionOrderDemo does this, but then the sequence never "resets" and I've got row numbers that keep incrementing. So one document has rows 4, 5, 6 and the next document has 7, 8, 9 when they should both have 1, 2, 3.
    2. Programmatically take care of the row numbers? This seems like I'm asking for trouble. Any time an insert or delete operation gets done, I'll have to iterate through rows and re-assign row numbers.
    3. Is there a way to assign row numbers in a table iterator (or data collection?) to an entity?
    Any suggestions would be appreciated. It's looking like #2 is my only option, but if anyone knows different I'd love the input.
    Will

    Thank you both guys.
    As John said, I believe I'm looking for a gap-free sequence per master record.
    The line number of the OrderItems table is the second half of the primary key. The first half of the primary key (DocumentNumber) is the foreign key to the Orders table.
    Think of it like line items on an order or invoice. For example, if you were talking to someone on the phone concerning an invoice, you might say, "The pricing for line item #3 is incorrect." In this case, it's good to have a common reference. Or imagine a Microsoft Excel spreadsheet with no row numbers displayed! You'd never get anywhere if you had to explain something over the phone.
    If this is tricky to perform, I take it using a sequence and trigger such as the Fusion Order Demo is the best way to approach the challenge for simplicity's sake?
    Will
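    Not the sequence-and-trigger approach from the Fusion Order Demo, but one common alternative for a gap-free number per master record is to derive the next line number from MAX(LINE_NUMBER) for that document at insert time, inside the same transaction. A rough JDBC sketch follows; the table and column names are hypothetical and the SQL assumes Oracle's NVL.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class OrderItemDao {
        // Inserts an order item, assigning the next gap-free line number
        // within the given document. Assumes DOCUMENT_NUMBER + LINE_NUMBER
        // form the key of ORDER_ITEMS and that this runs in one transaction.
        public void insertItem(Connection con, long documentNumber, String description)
                throws SQLException {
            String sql =
                "INSERT INTO order_items (document_number, line_number, description) "
                + "SELECT ?, NVL(MAX(line_number), 0) + 1, ? "
                + "FROM order_items WHERE document_number = ?";
            PreparedStatement ps = con.prepareStatement(sql);
            try {
                ps.setLong(1, documentNumber);
                ps.setString(2, description);
                ps.setLong(3, documentNumber);
                ps.executeUpdate();
            } finally {
                ps.close();
            }
        }
    }

    Two concurrent inserts into the same document could still collide on the unique key, so in practice the parent Orders row is usually locked first (SELECT ... FOR UPDATE), or the rows are re-numbered on delete as option 2 describes.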

  • Need to convert mini DVI to S-video to run a school's Hitachi ceiling-mount projector. What is the proper adapter?

    Need to convert mini DVI to S-video to run a school's Hitachi ceiling-mount projector. What is the proper adapter?

    Check the projector and see what kind of input it has; most have a VGA connector, so you would need a male-to-male VGA cable. You will also need the Mac mini DVI to male VGA adapter (of course).
    Some of the older projectors have a component input only, which is the red, yellow, green RCA connectors, which are available. (Usually they are fairly short, 5 ft or so, so you would need to get a VGA cable with the mini DVI to VGA, plus a component cable (the one with the 3 RCAs), and you will probably have to get a female-to-female VGA adapter to connect the VGA to the component.)
    And finally, if that weren't enough, the really old projectors have only a composite input. All I can say about that is "the hunt is on".
    Best of luck

  • What is the proper way to close all open sessions of a NI PXI-4110 for a given Device alias?

    I've found that, when programming the NI PXI-4110, if the "niDCPower Initialize With Channels" VI (NI-DCPower palette) is called with a device alias that already has one or more sessions open (due to an abort or other programming error), the session reference coming out of the reference-out terminal has "(*)" post-fixed to the device name, where * is an integer that increments with each initialize call. In my clean-up, I would like to close all open sessions. For example, let's say the device alias is "NIPower_1" in NI MAX, and there are 5 open sessions: NIPower_1, NIPower_1 (1), NIPower_1 (2), NIPower_1 (3), and NIPower_1 (4). A simple initialize or reset (using the niDCPower Initialize With Channels VI, the niDCPower Reset VI, etc.) does not clean these up. What is the proper way to close all open sessions?
    Thanks in advance. Been struggling with this for days!

    When you Initialize a session to a device that already has a session open, NI-DCPower closes the previous session and returns a new one. You can verify this very easily: try to use the first session after the second session was opened.
    Unfortunately, there is a small leak and that is what you encountered: the previous session remains registered with LabVIEW, since we unregister inside the Close VI and this was never called. So the name of the session still shows in the control like you noted: NIPower_1, NIPower_1 (1), NIPower_1 (2), NIPower_1 (3), and NIPower_1 (4), etc.
    There may be a way to iterate over the registered sessions, but I couldn't find it. However, you can unregister them by calling "IVI Delete Session". Look for it inside "niDCPower Close.vi". If you don't have the list of open sessions, but you have the device name, then you can just append (1), (2) and so forth and call "IVI Delete Session" in a loop. There's no problem calling it on sessions that were never added.
    However - I consider all this a hack. What you should do is write code that does not leak sessions. Anything you open, you should close. If you find yourself in a situation where there are a lot of leaked sessions during development, relaunching LabVIEW will clear it out. If relaunching LabVIEW is too much of an annoyance, then write a VI that does what I described above and run it when needed. You can even make it "smarter" by getting the names of all the NI-DCPower devices in your system using the System Configuration or niModInst APIs.
    Hope this helps.
    Marcos Kirsch
    Principal Software Engineer
    Core Modular Instruments Software
    National Instruments

  • What's the proper protocol for a reset on my iPod touch 4G? iOS 6 has totally jacked it up and it will no longer do anything but crash, and it won't sync with iTunes wirelessly or by cable.

    What's the proper protocol for a reset on my iPod touch 4G? iOS 6 has totally jacked it up and it will no longer do anything but crash, and it won't sync with iTunes wirelessly or by cable.
    It's a 64GB iPod touch and was fine till Apple told me to upgrade to iOS 6. Now most of my apps crash, my music won't play and I just get a white screen when I hit Music.
    When I try to sync to iTunes, it acts like it's going to sync and appears to recognize the iPod, but it's grayed out and has an update circle by it that spins for a while until iTunes eventually freezes altogether. Is there a way to go back to iOS 5 after an erase and reset?

    iOS: Unable to update or restore

  • What are the major processes to transfer data from a legacy system to the SAP system?

    What are the major processes to transfer data from a legacy system to the SAP system using BDC, in real time only?

    hi,
    BATCH DATA COMMUNICATION
    main methods are:
    1. SESSION METHOD
    2. CALL TRANSACTION
    3. DIRECT INPUT
    Advantages offered by BATCH INPUT method:
    1. Can process large data volumes in batch.
    2. Can be planned and submitted in the background.
    3. No manual interaction is required when data is transferred.
    4. Data integrity is maintained as whatever data is transferred to the table is through transaction. Hence batch input data is submitted to all the checks and validations.
    To implement one of the supported data transfers, you must often write the program that exports the data from your non-SAP system. This program, known as a “data transfer” program must map the data from the external system into the data structure required by the SAP batch input program.
    The batch input program must build all of the input to execute the SAP transaction.
    Two main steps are required:
    • To build an internal table containing every screen and every field to be filled in during the execution of an SAP transaction.
    • To pass the table to SAP for processing.
    Prerequisite for Data Transfer Program
    Writing a Data Transfer Program involves following prerequisites:
    Analyzing data from local file
    Analyzing transaction
    Analyzing transaction involves following steps:
    • The transaction code, if you do not already know it.
    • Which fields require input i.e., mandatory.
    • Which fields can you allow to default to standard values.
    • The names, types, and lengths of the fields that are used by a transaction.
    • Screen number and Name of module pool program behind a particular transaction.
    To analyze a transaction:
    • Start the transaction by menu or by entering the transaction code in the command box.
    (You can determine the transaction name by choosing System – Status.)
    • Step through the transaction, entering the data that will be required for processing your batch input data.
    • On each screen, note the program name and screen (dynpro) number.
    (dynpro = dyn + pro. Dyn = screen, pro = number)
    • Display these by choosing System – Status. The relevant fields are Program (dynpro) and Dynpro number. If pop-up windows occur during execution, you can get the program name and screen number by pressing F1 on any field or button on the screen.
    The technical info pop-up shows not only the field information but also the program and screen.
    • For each field, check box, and radio button on each screen, press F1 (help) and then choose Technical Info.
    Note the following information:
    - The field name for batch input, which you’ll find in its own box.
    - The length and data type of the field. You can display this information by double clicking on the Data Element field.
    • Find out the identification code for each function (button or menu) that you must execute to process the batch-input data (or to go to new screen).
    Place the cursor on the button or menu entry while holding down the left mouse button. Then press F1.
    In the pop-up window that follows, choose Technical info and note the code that is shown in the Function field.
    You can also run any function that is assigned to a function key by way of the function key number. To display the list of available function keys, click on the right mouse button. Note the key number that is assigned to the functions you want to run.
    Once you have the program name, screen number, and field name (screen field name), you can start writing the DATA TRANSFER program.
    Declaring internal table
    First, an internal table with a structure similar to the local file.
    Declaring internal table like BDCDATA
    The data from the internal table is not transferred directly to the database table; it has to go through a transaction. You need to pass data to a particular screen and to a particular screen field. Data is passed to the transaction in a particular format, hence there is a need for the batch input structure.
    The batch input structure stores the data that is to be entered into SAP system and the actions that are necessary to process the data. The batch input structure is used by all of the batch input methods. You can use the same structure for all types of batch input, regardless of whether you are creating a session in the batch input queue or using CALL TRANSACTION.
    This structure is BDCDATA, which can contain the batch input data for only a single run of a transaction. The typical processing loop in a program is as follows:
    • Create a BDCDATA structure
    • Write the structure out to a session or process it with CALL TRANSACTION USING; and then
    • Create a BDCDATA structure for the next transaction that is to be processed.
    Within a BDCDATA structure, organize the data of screens in a transaction. Each screen that is processed in the course of a transaction must be identified with a BDCDATA record. This record uses the Program, Dynpro, and Dynbegin fields of the structure.
    The screen identifier record is followed by a separate BDCDATA record for each value, to be entered into a field. These records use the FNAM and FVAL fields of the BDCDATA structure. Values to be entered in a field can be any of the following:
    • Data that is entered into screen fields.
    • Function codes that are entered into the command field. Such function codes execute functions in a transaction, such as Save or Enter.
    The BDCDATA structure contains the following fields:
    • PROGRAM: Name of module pool program associated with the screen. Set this field only for the first record for the screen.
    • DYNPRO: Screen Number. Set this field only in the first record for the screen.
    • DYNBEGIN: Indicates the first record for the screen. Set this field to X, only for the first record for the screen. (Reset to ‘ ‘ (blank) for all other records.)
    • FNAM: Field Name. The FNAM field is not case-sensitive.
    • FVAL: Value for the field named in FNAM. The FVAL field is case-sensitive. Values assigned to this field are always padded on the right, if they are less than 132 characters. Values must be in character format.
    Transferring data from local file to internal table
    Data is uploaded to the internal table by the UPLOAD or WS_UPLOAD function.
    Population of BDCDATA
    For each record of the first internal table, you need to populate a second internal table, which is declared like the BDCDATA structure.
    All these five initial steps are necessary for any type of BDC interface.
    DATA TRANSFER program can call SESSION METHOD or CALL TRANSACTION. The initial steps for both the methods are same.
    First step for both the methods is to upload the data to internal table. From Internal Table, the data is transferred to database table by two ways i.e., Session method and Call transaction.
    SESSION METHOD
    About Session method
    In this method you transfer data from internal table to database table through sessions.
    In this method, an ABAP/4 program reads the external data that is to be entered in the SAP System and stores the data in session. A session stores the actions that are required to enter your data using normal SAP transaction i.e., Data is transferred to session which in turn transfers data to database table.
    Session is intermediate step between internal table and database table. Data along with its action is stored in session i.e., data for screen fields, to which screen it is passed, the program name behind it, and how the next screen is processed.
    When the program has finished generating the session, you can run the session to execute the SAP transactions in it. You can either explicitly start and monitor a session or have the session run in the background processing system.
    Unless session is processed, the data is not transferred to database table.
    BDC_OPEN_GROUP
    You create the session through program by BDC_OPEN_GROUP function.
    Parameters to this function are:
    • User Name: User name
    • Group: Name of the session
    • Lock Date: The date on which you want to process the session.
    • Keep: This parameter is passed as ‘X’ when you want to retain session after
    processing it or ‘ ‘ to delete it after processing.
    BDC_INSERT
    This function creates the session & data is transferred to Session.
    Parameters to this function are:
    • Tcode: Transaction Name
    • Dynprotab: BDC Data
    BDC_CLOSE_GROUP
    This function closes the BDC Group. No Parameters.
    Some additional information for session processing
    When the session is generated using the KEEP option within the BDC_OPEN_GROUP, the system always keeps the sessions in the queue, whether it has been processed successfully or not.
    However, if the session is processed, you have to delete it manually. When session processing is completed successfully while KEEP option was not set, it will be removed automatically from the session queue. Log is not removed for that session.
    If the batch-input session is terminated with errors, then it appears in the list of INCORRECT session and it can be processed again. To correct incorrect session, you can analyze the session. The Analysis function allows to determine which screen and value has produced the error. If you find small errors in data, you can correct them interactively, otherwise you need to modify batch input program, which has generated the session or many times even the data file.
    CALL TRANSACTION
    About CALL TRANSACTION
    This is a technique similar to the SESSION method. While batch input is a two-step procedure, CALL TRANSACTION does both steps online, one after the other. In this method, you call a transaction from your program with:
    Call transaction <tcode> using <BDCTAB>
    Mode <A/N/E>
    Update <S/A>
    Messages into <MSGTAB>.
    Parameter – 1 is transaction code.
    Parameter – 2 is name of BDCTAB table.
    Parameter – 3 here you are specifying mode in which you execute transaction
    A is all screen mode. All the screen of transaction are displayed.
    N is no screen mode. No screen is displayed when you execute the transaction.
    E is error screen. Only those screens are displayed wherein you have error record.
    Parameter – 4 here you are specifying update type by which database table is updated.
    S is for synchronous update, in which if you change data of one table then all the related tables get updated, and sy-subrc is returned once for the complete update.
    A is for asynchronous update. When you change data of one table, sy-subrc is returned immediately, and then the updating of the other affected tables takes place. So if the system fails to update the other tables, the sy-subrc returned is still 0 (i.e., from when the first table gets updated).
    Parameter – 5 when you update database table, operation is either successful or unsuccessful or operation is successful with some warning. These messages are stored in internal table, which you specify along with MESSAGE statement. This internal table should be declared like BDCMSGCOLL, a structure available in ABAP/4. It contains the following fields:
    1. Tcode: Transaction code
    2. Dyname: Batch point module name
    3. Dynumb: Batch input Dyn number
    4. Msgtyp: Batch input message type (A/E/W/I/S)
    5. Msgspra: Batch input Lang, id of message
    6. Msgid: Message id
    7. MsgvN: Message variables (N = 1 - 4)
    For each entry that is updated in the database table, a message is available in BDCMSGCOLL. As BDCMSGCOLL is a structure, you need to declare an internal table that can contain multiple records (unlike a structure).
    Steps for CALL TRANSACTION method
    1. Internal table for the data (structure similar to your local file)
    2. BDCTAB like BDCDATA
    3. UPLOAD or WS_UPLOAD function to upload the data from local file to itab. (Considering file is local file)
    4. Loop at itab.
    Populate BDCTAB table.
    Call transaction <tcode> using <BDCTAB>
    Mode <A/N/E>
    Update <S/A>.
    Refresh BDCTAB.
    Endloop.
    (To populate BDCTAB, You need to transfer each and every field)
    The major differences between the Session method and Call Transaction are as follows:
    1. Session method: Data is not updated in the database table unless the session is processed. Call Transaction: Immediate update in the database table.
    2. Session method: No sy-subrc is returned. Call Transaction: sy-subrc is returned.
    3. Session method: An error log is created for error records. Call Transaction: Errors need to be handled explicitly.
    4. Session method: Update in the database table is always synchronous. Call Transaction: Update in the database table can be synchronous or asynchronous.
    Error Handling in CALL TRANSACTION
    When Session Method updates the records in database table, error records are stored in the log file. In Call transaction there is no such log file available and error record is lost unless handled. Usually you need to give report of all the error records i.e., records which are not inserted or updated in the database table. This can be done by the following method:
    Steps for the error handling in CALL TRANSACTION
    1. Internal table for the data (structure similar to your local file)
    2. BDCTAB like BDCDATA
    3. Internal table BDCMSG like BDCMSGCOLL
    4. Internal table similar to the first internal table
    (Third and fourth steps are for error handling)
    5. UPLOAD or WS_UPLOAD function to upload the data from the local file to itab. (Considering file is local file)
    6. Loop at itab.
    Populate BDCTAB table.
    Call transaction <tr.code> using <Bdctab>
    Mode <A/N/E>
    Update <S/A>
    Messages <BDCMSG>.
    Perform check.
    Refresh BDCTAB.
    Endloop.
    7. Form check.
    IF sy-subrc <> 0. (Call transaction returns the sy-subrc if updating is not successful).
    Call function Format_message.
    (This function is called to store the message given by system and to display it along with record)
    Append itab2.
    Display the record and message.
    DIRECT INPUT
    About Direct Input
    In contrast to batch input, this technique does not create sessions, but stores the data directly. It does not simulate the online transaction. To enter the data into the corresponding database tables directly, the system calls a number of function modules that execute any necessary checks. In case of errors, the direct input technique provides a restart mechanism. However, to be able to activate the restart mechanism, direct input programs must be executed in the background only. Direct input checks the data thoroughly and then updates the database directly.
    You can start a Direct Input program in two ways;
    Start the program directly
    This is the quickest way to see if the program works with your flat file. This option is possible with all direct input programs. If the program ends abnormally, you will not have any logs telling you what has or has not been posted. To minimize the chance of this happening, always use the check file option for the first run with your flat file. This allows you to detect format errors before transfer.
    Starting the program via the DI administration transaction
    This transaction restarts the processing if the data transfer program aborts. Since DI documents are immediately posted into the SAP database, the restart option prevents the duplicate document posting that can occur during a program restart (i.e., without adjusting your flat file).
    Direct input is usually done for standard data like material master, FI accounting document, SD sales order and Classification for which SAP has provided standard programs.
    First time you work with the Direct Input administration program, you will need to do some preparation before you can transfer data:
    - Create variant
    - Define job
    - Start job
    - Restart job
    Common batch input errors
    - The batch input BDCDATA structure tries to assign values to fields which do not exist in the current transaction screen.
    - The screen in the BDCDATA structure does not match the right sequence, or an intermediate screen is missing.
    - On exceptional occasions, the logic flow of a batch input session does not exactly match that of manual online processing. This can be discovered by testing the sessions online.
    - The BDCDATA structure contains fields, which are longer than the actual definition.
    - Authorization problems.
    RECORDING A BATCH INPUT
    A batch input recording allows you to record an R/3 transaction and generate a program that contains all screen and field information in the required BDCDATA format.
    You can either use SHDB transaction for recording or
    System → Services → Batch Input → Edit
    And from here click recording.
    Enter name for the recording.
    (Dates are optional)
    Click recording.
    Enter transaction code.
    Enter.
    Click Save button.
    You finally come to a screen where, you have all the information for each screen including BDC_OKCODE.
    • Click Get Transaction.
    • Return to BI.
    • Click overview.
    • Position the cursor on the just recorded entry and click generate program.
    • Enter program name.
    • Click enter
    The program is generated for the particular transaction.
    BACKGROUND PROCESSING
    Need for Background processing
    When a large volume of data is involved, usually all batch inputs are done in background.
    The R/3 system includes functions that allow users to work non-interactively or offline. The background processing systems handle these functions.
    Non-interactively means that instead of executing the ABAP/4 programs and waiting for an answer, user can submit those programs for execution at a more convenient planned time.
    There are several reasons to submit programs for background execution.
    • The maximum time allowed for online execution should not exceed 300 seconds. User gets TIMEOUT error and an aborted transaction, if time for execution exceeds 300 seconds. To avoid these types of error, you can submit jobs for background processing.
    • You can use the system while your program is executing.
    This does not mean that interactive or online work is not useful. Both types of processing have their own purposes. Online work is the most common for entering business data, displaying information, printing small reports, managing the system and so on. Background jobs are mainly used for the following tasks: to process large amounts of data, to execute periodic jobs without human intervention, and to run programs at a more convenient, planned time other than during normal working hours, i.e., nights or weekends.
    The transaction for background processing is SM36.
    Or
    Tools → Administration → Jobs → Define jobs
    Or
    System → Services → Jobs
    Components of the background jobs
    A job in Background processing is a series of steps that can be scheduled and step is a program for background processing.
    • Job name. Defines the name assigned to the job. It identifies the job. You can specify up to 32 characters for the name.
    • Job class. Indicates the type of background processing priority assigned to the job.
    The job class determines the priority of a job. The background system admits three types of job classes: A, B & C, which correspond to job priority.
    • Job steps. Parameters to be passed for this screen are as follows:
    Program name.
    Variant if it is report program
    Start criteria for the job: Option available for this are as follows:
    Immediate - allows you to start a job immediately.
    Date/Time - allows you to start a job at a specific date and time.
    After job - you can start a job after a particular job.
    After event - allows you to start a job after a particular event.
    At operation mode - allows you to start a job when the system switches to a particular operation mode.
    Defining Background jobs
    It is a two-step process: first you define the job and then you release it.
    When users define a job and save it, they are actually scheduling the report i.e., specifying the job components, the steps, the start time.
    When users schedule program for background processing, they are instructing the system to execute an ABAP/4 report or an external program in the background. Scheduled jobs are not executed until they are released. When jobs are released, they are sent for execution to the background processing system at the specified start time. Both scheduling and releasing of jobs require authorizations.
    HANDLING OF POP UP SCREEN IN BDC
    Many times in a transaction a pop-up screen appears, and for this screen you don't pass any record but just an indication to the system telling it to proceed further.
    To handle such screen, system has provided a variable called BDC_CURSOR. You pass this variable to BDCDATA and process the screen.
    Usually such screen appears in many transactions, in this case you are just passing information, that YES you want to save the information, that means YES should be clicked. So you are transferring this information to BDCDATA i.e., field name of YES which is usually SPOT_OPTION. Instead of BDC_OKCODE, you are passing BDC_CURSOR.
    BDC_CURSOR is also used to place cursor on particular field.
    Consider a simple transaction where you are entering a customer number on the first screen, and on the next screen data is displayed for that particular customer number. The fields we are changing here are name and city. When you click Save, the changed record gets saved.
    Prerequisite to write this BDC interface as indicated earlier is:
    1. To find screen number
    2. To find screen field names, type of the field and length of the field.
    3. To find BDC_OKCODE for each screen
    4. Create flat file.
    Generally, Batch Input is used to transfer large amounts of data. For example, if you are implementing a new SAP project, you will of course need some data transfer from the legacy system to the SAP system.
    CALL TRANSACTION is used especially for integration actions between two SAP systems or between different modules. Users sometimes wish to do something like click a button or an item and have SAP insert or change data automatically. Here CALL TRANSACTION should be considered.
    To transfer data for multiple transactions, usually the Batch Input method is used.
    check these sites for step by step process:
    For BDC:
    http://myweb.dal.ca/hchinni/sap/bdc_home.htm
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/home/bdc&
    http://www.sap-img.com/abap/learning-bdc-programming.htm
    http://www.sapdevelopment.co.uk/bdc/bdchome.htm
    http://www.sap-img.com/abap/difference-between-batch-input-and-call-transaction-in-bdc.htm
    http://help.sap.com/saphelp_47x200/helpdata/en/69/c250684ba111d189750000e8322d00/frameset.htm
    http://www.sapbrain.com/TUTORIALS/TECHNICAL/BDC_tutorial.html
    Check these link:
    http://www.sap-img.com/abap/difference-between-batch-input-and-call-transaction-in-bdc.htm
    http://www.sap-img.com/abap/question-about-bdc-program.htm
    http://www.itcserver.com/blog/2006/06/30/batch-input-vs-call-transaction/
    http://www.planetsap.com/bdc_main_page.htm
    Call Transaction or Session method?

  • What Is The Proper J2EE Architecture For This National Defense Project?

    A brief description of a national defense project I am looking into is that the system being developed is supposed to verify the "smartcard" of the card holder and his/her fingerprints against the Defense Department database. The system also reads in information about people (facial features, height, weight, etc.) and stores the information in the database, and then issues new "smartcard".
    1. The people involved in the project have already decided to use Swing for the client tier.
    2. The problem is the Defense Department database. It is a flat-file database and this database cannot be touched. It is a black box for Java developers. The Defense Department may provide a jar as an outer layer over this database.
    Question 1: Is the web tier essential for this defense project?
    Question 2: What is the proper J2EE architecture for this project?
    Question 3: How do we deal with the data problem? Especially since it looks like entity beans are not applicable.
    Question 4: How does the system handle a very heavy load? It is possible that tens of thousands of people will use the system at the same time.

    Hi,
    You can get it done here: http://www.thesoftwareobjects.com
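    On Question 3, since the Defense Department database is a black box reachable only through a provided jar, one common approach is to hide it behind a plain DAO interface so the other tiers (for example, stateless session beans called from the Swing client) never touch the flat file directly and entity beans are not needed. A rough sketch follows, with all names hypothetical since the jar's actual API is unknown:

    // Hypothetical DAO boundary around the Defense Department jar.
    // The rest of the application (session beans, Swing client) codes
    // against this interface only, so the flat-file backend can be
    // swapped or mocked without touching the other tiers.
    public interface PersonDao {

        // Simple immutable value object for the data the system needs.
        final class PersonRecord {
            public final String cardId;
            public final byte[] fingerprintTemplate;

            public PersonRecord(String cardId, byte[] fingerprintTemplate) {
                this.cardId = cardId;
                this.fingerprintTemplate = fingerprintTemplate;
            }
        }

        PersonRecord findByCardId(String cardId);

        boolean verifyFingerprint(String cardId, byte[] capturedTemplate);

        void save(PersonRecord record);
    }

    One implementation of this interface would delegate to whatever API the provided jar exposes; load could then be handled by pooling the session beans that call it rather than by changing the data layer.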

  • What are the Oracle processes involved in Data Guard operation?

    Hi All,
    I have a Primary and secondary physical standby database.
    I want to know which Oracle processes are involved in the synchronization between the primary and the standby.
    Thanks
    Santosh

    The best place to get this information is the Data Guard Concepts and Administration Guide.
    The link for 10g Release 2
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14239/concepts.htm#i1039416
    The link for 10g Release 1
    http://download-west.oracle.com/docs/cd/B14117_01/server.101/b10823/concepts.htm#1039415
    The link for Oracle 9i
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96653/concepts.htm#1027493

  • What is the proper procedure to backup to multiple external hard drives if one wishes to rotate drives such that you may store one in a fire safe?

    Description: If I want to back up to two external hard drives using Time Machine, what is the proper procedure to follow such that I could keep one attached to Time Machine for daily backups, and one that I attach monthly so that I can back it up monthly and then store it in a fire safe? To do this, what are the exact steps to follow?
    Research in Progress:
    I selected the "?" icon in Time Machine which took me to "Mac Help - Time Machine Preferences" section.  I reviewed this section, including the "Select Backup Disk, Select Disk, Add or Remove Backup Disk."  It says, "If you haven’t set up Time Machine, click Select Backup Disk to set up a backup disk. Repeat the steps for each backup disk you want to set up."  I did add two backup disks.  It goes on to say "If you already set up Time Machine, your backup disks are listed, and information about each backup disk is shown. To select, add, or remove backup disks, click Select Disk or Add or Remove Backup Disk."  Well, what is lacking here is what happens when you "Remove" the backup disk.
    As I have read this section, it raises additional questions:
    What are some cautionary statements?
    What happens after you remove a backup disk? Will Time Machine recognize it later?
    What is the proper procedure for removing and ejecting one of your external hard drives when you use it with Time Machine? What happens if you don't "remove" a backup drive and instead just drag it to the "trash can", i.e. eject it, and later reconnect it?

    Time Machine supports multiple backup drives including network area storage volumes (NAS) as well as physically-connected external backup drives.
    In , System Preferences, Time Machine, add the second drive by clicking Select Disk.
    All available connected drives including connected network drives are displayed.
    Select the additional drives, one at a time.
    Connected Network drives are displayed on the desktop only when actively being used by Time Machine for backup.
    When prompted, select 'use both' when adding the second drive.
    Then choose Select or Remove at the bottom of the list of available devices and network volumes to add additional drives.
    Time Machine then automatically rotates backups among the available designated devices and volumes.
    If the drive is physically not present, Time Machine skips to the next available backup drive.
    To DISMOUNT a drive to take it off-site, it is NOT necessary to remove it from the Time Machine preferences.
    When the drive is NOT in use, dismount it by Command+clicking that Drive icon on your desktop (or in Finder) and then "Eject".
    That's it.
    After 10 days with no backups to a given device, Time Machine does display a message and (as I recall) asks if you want to remove that drive from the rotation.
    You can never be too rich or have too many backups!

Maybe you are looking for

  • For those of you who cannot afford a proper video editing system...


  • Access entities in "cascade"


  • How to implement transactions in Tomcat using MySQL


  • Force Extension On Export From FCP


  • How to recover accidentally deleted data in MAC X10.9
