Checkbox values don't submit when loaded from .fdf file

Hello,
     I have created a form using LiveCycle. The form is part of a web application that allows users to edit and save forms using .fdf data. The user gets a .fdf file which grabs its template from my web server. The .fdf fills in the saved data fine, but when the user submits, checkbox values are not sent correctly. If the user manually unchecks and rechecks a box, its value is sent; but if they leave the box checked without touching it, the value is not submitted. The form submits using an HTTP submit button pointed at my server. Everything comes over correctly except the checkboxes.
Any help would be greatly appreciated.

Do you have the PDF and FDF posted anywhere that we could look at? I've used a /F tag pointing to an HTTP location many times ... it does work.
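In the meantime, here is a minimal sketch of what a working FDF looks like, with a checkbox value and an /F entry pointing at an HTTP template. The field name, export value, and URL are placeholders, not taken from your form:

    %FDF-1.2
    1 0 obj
    << /FDF << /Fields [ << /T (approved) /V /Yes >> ]
              /F (http://www.example.com/forms/template.pdf) >> >>
    endobj
    trailer
    << /Root 1 0 R >>
    %%EOF

One thing worth checking against a file like this: for a checkbox, the /V entry is a name that has to match the field's export value exactly (often /Yes, but LiveCycle may have assigned /On or a custom value). A mismatch there is a common source of checkboxes that look right on screen but misbehave on submit.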

Similar Messages

  • Error when loading from Flat File

    I get the message below when trying to load from a flat file. The change that I made to the Transfer Structure was to move the items. How can I resolve this? Thanks
    Error 8 when compiling the upload program: row 227, message: Data type /BIC/CCABSMMIMH1 was found in a newer ve

    hi Niten,
    check if this helps:
    A newer version of the data type error when loading

  • Component Slider not working when loaded from another swf file

    I have a simple Flash application that uses a Slider component to increase or decrease the size of the text in a TextArea (ta). It works perfectly fine on its own; however, when I try to load the same swf file from another application, I get the following error...
    ReferenceError: Error #1069: Property fl.managers:IFocusManager::form not found
    on fl.managers.FocusManager and there is no default value.
    at fl.controls::Slider/thumbPressHandler()
    Code...
    import fl.events.*;
    import flash.text.TextFormat;

    ta.text = "Lorem ipsum dolor sit amet";

    // base text format for the TextArea
    var tf:TextFormat = new TextFormat();
    tf.color = 0xCCCCCC;
    tf.font = "Trebuchet MS";
    tf.size = 12;

    slider.addEventListener(SliderEvent.THUMB_DRAG, sliderChange);
    style();

    // apply the current TextFormat to the TextArea
    function style():void {
        ta.setStyle("textFormat", tf);
    }

    // resize the text as the slider thumb is dragged
    function sliderChange(e:SliderEvent):void {
        tf.size = slider.value;
        ta.setStyle("textFormat", tf);
    }
    Could the containing swf file that I'm loading the slider swf into in any way affect the slider application? I don't quite understand why it works on its own, but not when loaded from another app.

    Add a Slider component to the main (loading) swf's library. When two swfs share an application domain, the loader's class definitions take precedence, so the loader needs to carry a complete, matching set of the fl component classes (including the fl.managers FocusManager classes the error points at).

  • Chinese characters scrambled when loading from DS to BW

    Hi, I've been pulling my hair out with this issue.
    I have a flat file containing Chinese text. When I load this in BW using 'FLATFILE' as a source system, it works fine. BW shows the correct Chinese characters.
    When I do the same load using BODI, I get funny characters.
    When I use BODI to load from one flat file into another flat file, the Chinese characters remain correct.
    What do I need to do to make sure I get the right Chinese characters in BW when loading from BODI?
    BODI is installed on Unix on Oracle 10.
    I run the jobs as batch processes.
    The dsconfig.txt has got:
    AL_Engine=<default>_<default>.<default>
    There are no locale settings in al_env.sh
    BW target is UTF-8 codepage.
    File codepage is BIG5-HKSCS
    BODI is set up as a Unicode system in SAP BW.
    When loading flat file to flat file, I get a message:
    DATAFLOW: The specified locale <eng_gb.iso-8859-1> has been coerced to <Unicode (UTF-16)>
    because the datastore <TWIN_FF_CUSTOMER_LOCAL> obtains data in <BIG5-HKSCS> codepage.
    JOB: Initializing transcoder for datastore <TWIN_FF_CUSTOMER_LOCAL> to transcode between
    engine codepage <Unicode (UTF-16)> and datastore codepage <BIG5-HKSCS>
    When loading to BW, the messages are almost the same, but the last step is UTF-16 to UTF-8.
    I read the wiki post which definitely helped me to understand the rationale behind code page, but now I ran out of ideas what else to check ( http://wiki.sdn.sap.com/wiki/display/BOBJ/Multiple+Codepages )
    Any help would be greatly appreciated.
    Jan.

    Hi all. Thanks for the Inputs. This is what I got when I clicked on the Details Tab of the Monitor....
    Error when transferring data; communication error when analyzing
    Diagnosis
    Data packages or InfoPackages are missing in BI but there were no apparent processing errors in the source system. It is therefore probable that there was an error in the data transfer.
    The analysis tried to read the ALE outbox of the source system. This led to an error.
    It is possible that there is no connection to the source system.
    Procedure
    Check the TRFC overview in the source system.
    Check the connection to the source system for errors and check the authorizations and profiles of the remote user in both the BI and source systems.
    Check the ALE outbox of the source system for IDocs that have not been updated.

  • Anyone Locked a generated SO by item/header when loading from ORDERS05?

    Has anyone managed to successfully lock a generated Sales Order when loading from an IDoc of type ORDERS05? I create a lock at the item level if necessary, but for some reason it zeroises the item Qty, almost as though no Schedule segment was supplied, even though one is.
    Has anyone hit this same problem and overcome it?
    Blue

    This is in reply to the first post. I don't know what happened after.
    Caused by: java.security.AccessControlException: access denied (java.util.PropertyPermission sun.arch.data.model read)
         at java.security.AccessControlContext.checkPermission(Unknown Source)
         at java.security.AccessController.checkPermission(Unknown Source)
         at java.lang.SecurityManager.checkPermission(Unknown Source)
         at java.lang.SecurityManager.checkPropertyAccess(Unknown Source)
         at java.lang.System.getProperty(Unknown Source)
         at org.eclipse.swt.internal.Library.loadLibrary(Library.java:167)
         at org.eclipse.swt.internal.Library.loadLibrary(Library.java:151)
         at org.eclipse.swt.internal.C.<clinit>(C.java:21)
    If you read the above trace from bottom to top, it shows none of your classes, only classes from that Eclipse library, which seems to loadLibrary() a native DLL. In order to do this, it needs to call System.getProperty("sun.arch.data.model"). This call is not allowed from an unsigned applet. So I guess you need to sign the applet, and this problem will go away. Many other problems may follow. Just read very carefully all the related documentation, which I did not.
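    To see the restriction in isolation, here is a minimal sketch (only the property name comes from the trace above; the class name and everything else are illustrative). Note that doPrivileged() only asserts the privileges of the calling code's own protection domain, so in a fully unsigned applet it does not help either, which is why signing is the real fix:

        import java.security.AccessController;
        import java.security.PrivilegedAction;

        public class PropertyProbe {
            public static void main(String[] args) {
                // Under an applet's SecurityManager, the bare call
                // System.getProperty("sun.arch.data.model") throws
                // AccessControlException (PropertyPermission ... read).

                // doPrivileged succeeds only if this class is itself trusted
                // (e.g. the applet jar is signed and the user accepted it).
                String model = AccessController.doPrivileged(new PrivilegedAction<String>() {
                    public String run() {
                        return System.getProperty("sun.arch.data.model");
                    }
                });
                System.out.println("JVM data model: " + model + "-bit");
            }
        }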

  • Data loading from flat file to cube using bw3.5

    Hi Experts,
                       Kindly give me the detailed steps, with screens, for data loading from a flat file to a cube using BW 3.5. Please.

    Hi ,
    Procedure
    You are in the Data Warehousing Workbench in the DataSource tree.
    1. Select the application components in which you want to create the DataSource and choose Create DataSource.
    2. On the next screen, enter a technical name for the DataSource, select the type of DataSource and choose Copy.
    The DataSource maintenance screen appears.
    3. Go to the General tab page.
    a. Enter descriptions for the DataSource (short, medium, long).
    b. As required, specify whether the DataSource builds an initial non-cumulative and can return duplicate data records within a request.
    c. Specify whether you want to generate the PSA for the DataSource in the character format. If the PSA is not typed, it is not generated in a typed structure but with character-like fields of type CHAR only.
    Use this option if conversion during loading causes problems, for example because there is no appropriate conversion routine, or if the source cannot guarantee that data is loaded with the correct data type.
    In this case, after you have activated the DataSource you can load data into the PSA and correct it there.
    4. Go to the Extraction tab page.
    a. Define the delta process for the DataSource.
    b. Specify whether you want the DataSource to support direct access to data.
    c. Note that real-time data acquisition is not supported for data transfer from files.
    d. Select the adapter for the data transfer. You can load text files or binary files from your local workstation or from the application server.
    Text-type files contain only characters that can be displayed and read as text. CSV and ASCII files are examples of text files. For CSV files you have to specify a character that separates the individual field values. In BI, you have to specify this separator character and an escape character which marks the separator as a component of a value where required. After specifying these characters, you have to use them in the file. ASCII files contain data in a specified length; the defined field length in the file must be the same as that of the assigned field in BI.
    Binary files contain data in the form of bytes. A file of this type can contain any byte value, including bytes that cannot be displayed or read as text. In this case, the field values in the file have to match the internal format of the assigned field in BI.
    Choose Properties if you want to display the general adapter properties.
    e. Select the path to the file that you want to load, or enter the name of the file directly, for example C:/Daten/US/Kosten97.csv.
    You can also create a routine that determines the name of your file. If you do not create a routine, the system reads the file name directly from the File Name field.
    f. Depending on the adapter and the file to be loaded, make further settings.
    For binary files:
    Specify the character record settings for the data that you want to transfer.
    For text-type files:
    Specify how many rows in your file are header rows and can therefore be ignored when the data is transferred.
    Specify the character record settings for the data that you want to transfer.
    For ASCII files:
    If you are loading data from an ASCII file, the data is requested with a fixed data record length.
    For CSV files:
    If you are loading data from an Excel CSV file, specify the data separator and the escape character.
    Specify the separator that your file uses to divide the fields in the Data Separator field.
    If the data separator character is part of a value, the file indicates this by enclosing the value in particular start and end characters. Enter these start and end characters in the Escape Characters field.
    Say you chose the ; character as the data separator, but your file contains the value 12;45 for a field. If you set " as the escape character, the value in the file must be "12;45" so that 12;45 is loaded into BI. The complete value that you want to transfer has to be enclosed by the escape characters.
    If the escape characters do not enclose the value but are used within it, the system interprets them as a normal part of the value. If you have specified " as the escape character, the value 12"45 is transferred as 12"45 and 12"45" is transferred as 12"45".
    In a text editor (for example, Notepad), check the data separator and the escape character currently being used in the file. These depend on the country version of the file you used.
    Note that if you do not specify an escape character, the space character is interpreted as the escape character. We recommend that you use a different character as the escape character.
    If you select the Hex indicator, you can specify the data separator and the escape character in hexadecimal format. When you enter a character for the data separator and the escape character, these are displayed as hexadecimal code after the entries have been checked. A two-character entry for a data separator or an escape sign is always interpreted as a hexadecimal entry. (A short sketch of this separator/escape rule follows after this procedure.)
    g. Make the settings for the number format (thousands separator and character used to represent a decimal point), as required.
    h. Make the settings for currency conversion, as required.
    i. Make any further settings that are dependent on your selection, as required.
    5. Go to the Proposal tab page.
    This tab page is only relevant for CSV files. For files in other formats, define the field list on the Fields tab page.
    Here you create a proposal for the field list of the DataSource based on the sample data from your CSV file.
    a. Specify the number of data records that you want to load and choose Upload Sample Data.
    The data is displayed in the upper area of the tab page in the format of your file.
    The system displays the proposal for the field list in the lower area of the tab page.
    b. In the table of proposed fields, use Copy to Field List to select the fields you want to copy to the field list of the DataSource. All fields are selected by default.
    6. Go to the Fields tab page.
    Here you edit the fields that you transferred to the field list of the DataSource from the Proposal tab page. If you did not transfer the field list from a proposal, you can define the fields of the DataSource here.
    a. To define a field, choose Insert Row and specify a field name.
    b. Under Transfer, specify the decision-relevant DataSource fields that you want to be available for extraction and transferred to BI.
    c. Instead of generating a proposal for the field list, you can enter InfoObjects to define the fields of the DataSource. Under Template InfoObject, specify InfoObjects for the fields in BI. This allows you to transfer the technical properties of the InfoObjects into the DataSource field.
    Entering InfoObjects here does not equate to assigning them to DataSource fields. Assignments are made in the transformation. When you define the transformation, the system proposes the InfoObjects you entered here as InfoObjects that you might want to assign to a field.
    d. Change the data type of the field, if required.
    e. Specify the key fields of the DataSource.
    These fields are generated as a secondary index in the PSA. This is important in ensuring good performance for data transfer process selections, in particular with semantic grouping.
    f. Specify whether lowercase is supported.
    g. Specify whether the source provides the data in the internal or external format.
    h. If you choose the external format, ensure that the output length of the field (external length) is correct. Change the entries, as required.
    i. If required, specify a conversion routine that converts data from an external format into an internal format.
    j. Select the fields for which you want to be able to set selection criteria when scheduling a data request using an InfoPackage. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage.
    k. Choose the selection options (such as EQ, BT) that you want to be available for selection in the InfoPackage.
    l. Under Field Type, specify whether the data to be selected is language-dependent or time-dependent, as required.
    7. Check, save and activate the DataSource.
    8. Go to the Preview tab page.
    If you select Read Preview Data, the number of data records you specified is displayed in a preview.
    This function allows you to check whether the data formats and data are correct.
    For More Info:  http://help.sap.com/saphelp_nw70/helpdata/EN/43/01ed2fe3811a77e10000000a422035/content.htm
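    Here is the promised sketch of the separator/escape rule from step 4f, written as a plain Java method. It is a simplified illustration of the enclosing case only (names are assumptions; the real BI parser also handles the in-value escape behaviour described above):

        import java.util.ArrayList;
        import java.util.List;

        public class SeparatorEscapeDemo {

            // Split one line using a data separator and an enclosing escape character.
            // A separator inside an enclosed value is kept as part of the value.
            static List<String> splitLine(String line, char sep, char esc) {
                List<String> fields = new ArrayList<String>();
                StringBuilder cur = new StringBuilder();
                boolean enclosed = false;
                for (char c : line.toCharArray()) {
                    if (c == esc) {
                        enclosed = !enclosed;         // start/end of an enclosed value
                    } else if (c == sep && !enclosed) {
                        fields.add(cur.toString());   // field boundary
                        cur.setLength(0);
                    } else {
                        cur.append(c);                // ordinary character, or an
                    }                                 // enclosed separator like 12;45
                }
                fields.add(cur.toString());
                return fields;
            }

            public static void main(String[] args) {
                // "12;45" enclosed in escape characters arrives as one value, 12;45
                System.out.println(splitLine("A;\"12;45\";B", ';', '"'));
                // prints [A, 12;45, B]
            }
        }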

  • Cube creation & Data loading from Flat file

    Hi All,
    I am new to BI 7. Trying to create a cube and load data from a flat file.
    Successfully created the infosource and Cube (used infosource as a template for cube)
    But I got stuck at that point.
    I need help on how to create transfer rules/update rules and then load data into it.
    Thanks,
    Praveen.

    Hi
    Right-click on the InfoSource -> Additional Functions -> Create Transfer Rules.
    Now, in the window, insert the fields you want to load from the flat file and activate.
    Now right-click on the cube -> Additional Functions -> Create Update Rules, and activate.
    Click on the small arrow on the left, and when you reach the last node (DS),
    right-click on it -> Create InfoPackage -> External Data tab -> give your flat file path and select CSV format -> Schedule tab -> click on Start.
    Hope it helps.
    cheers.

  • I can't enter any values in JTextField after loading the flash files.

    Hi to all,
    In my application I have two panels. One panel has a JTextField, and in the other panel I load flash files.
    The flash files are built with CS4. I'm using JDIC to load the flash files. When I open the application and a flash file has loaded, I try to enter input in the text field, but I can't. After minimizing and maximizing the application I can enter values. This problem happens only after I changed the JRE version from 1.6 to 1.7.
    While loading flash files I'm getting the below exception.
    org.jdesktop.jdic.init.JdicInitException: java.io.IOException: The filename, directory name, or volume label syntax is incorrect
         at org.jdesktop.jdic.init.JdicManager.initBrowserNative(Unknown Source)
         at org.jdesktop.jdic.browser.WebBrowser.<clinit>(Unknown Source)
    Please give me a solution.
    Thanks in advance...:)

    Locking; duplicate of "I can't enter any values in JTextField after loading the flash files."

  • Tracking history while loading from flat file

    Dear Experts
    I have a scenario wherein I have to load data from a flat file at regular intervals and also want to track the changes made to the data.
    The data will look like this, and it keeps changing; the key is P1,A1,C1.
    DAY 1 -- P1,A1,C1,100,120,100
    DAY 2 -- P1,A1,C1,125,123,190
    DAY 3 -- P1,A1,C1,134,111,135
    DAY 4 -- P1,A1,C1,888,234,129
    I am planning to load the data into an ODS and then into an InfoCube for reporting. What will be the result, and how do I track history, e.g. if I want to see both what the data was on Day 1 and the current data for P1,A1,C1?
    Just to mention: as I am loading from a flat file, I am not mapping RECORDMODE in the ODS from the flat file.
    Thanks and regards
    Neel

    Hi
    You don't mention your BI release level, so I will assume you are on the current release, SAP NetWeaver BI 2004s.
    Consider loading to a write-optimized DataStore object to store the data. That way, you automatically have a unique technical key for each record, and it will be kept for historical purposes. In addition, load to a standard DataStore object, which will track the changes in its change log. Then load the cube from the change log (this avoids your summarization concern), as the changes will be updates (after images) in the standard DataStore object.
    Thanks for any points you choose to assign
    Best Regards -
    Ron Silberstein
    SAP

  • CUNIT error in data loading from flat file after r/3 extraction

    Hi all
    After R/3 business content extraction, when I load data from a flat file to an InfoCube, I get a conversion exit CUNIT error. What might be the reason? The data in the flat file's 0UNIT column is accurate, and the mapping rules are also correct, but I still get the CUNIT error.

    Check your units: if you are loading amounts or quantities, check what mapping you have and what you are loading from the flat files.
    BK

  • MARS - Load From Seed File fails

    Hi, everybody.
    I have a problem with importing a larger number of devices into MARS using the Load from Seed File functionality. I successfully connect to the FTP server, but I receive the following error:
    Status: Errors occured while retrieving csv file from ftp server. sed: can't read /tmp/mars.csv: No such file or directory while executing "exec sed -i "s/\15//g" /tmp/$fileName" (file "./ftpconfig.exp" line 243)
    Any ideas how to resolve it ?
    Thanks in advance
    Marko

    Hello,
    It seems to me that if you have any of the characters /\.*[]^$ in an SNMP string, it could cause problems for sed, which parses the file. Try to change the SNMP strings to avoid those characters.
    BR,
    Marko

  • Problem when loading from ODS to the CUBE

    Hi Experts,
    I am facing an unusual problem when loading data from an ODS to a cube.
    I have a first-level ODS where the delta postings are updated. I have checked the active table of the ODS and the data is accurate.
    I have deleted the entire data from the cube and am trying to do a full load from the ODS to the cube.
    I am sure that when I run a full load, the data comes from the active table of the ODS.
    After the full load, the key figure values are 0. I have tried testing by loading a couple of sales documents, and still the key figure values are 0.
    I would expect the full load to pick up the data exactly as it is in the active table of the ODS.
    I also don't have any fancy routines in the update rules.
    Please help me in this regard.
    Regards
    Raghu

    Hi,
    Check whether you followed the procedure exactly. Just follow these layman's steps and let me know your issue:
    o First prepare a flat file in a Microsoft Excel sheet for master data and transaction data, save it in .CSV format and close it.
    o Select the InfoObjects option and create an InfoArea, then an InfoObject catalog, then the characteristics and key figures. Create 'id' as a characteristic with 'name' as an attribute and activate it; under key figures create 'no' and activate it.
    o In InfoSources, create an application component, then create an InfoSource with direct update for master data and flexible update for transaction data. For the master data flat file, create an InfoPackage and execute it.
    o For transaction data, go to InfoProvider, right-click on it and select Create ODS Object. In the screen that opens, give the name of the ODS; in the next screen, drag the characteristics to the key fields.
    o Activate it, then go to the update rules; they will give an error, so go to the communication structure, add 0RECORDMODE and activate it, then go back to the update rules and activate them.
    o Then go to the InfoSource and create an InfoPackage with External Data, Processing, Update and Data Targets, then Scheduling; click Start and open the Monitor.
    o If you are not able to see the records, go back to the InfoProvider, right-click on it and select Activate Data in ODS; a screen opens.
    o Select the QM request, which will be in yellow; double-click it, and in the screen that opens select the green option and click Continue. Back on the previous screen, select the option, click Save, and close it.
    o Then come back to the main screen, go to the InfoSource and the InfoPackage, right-click on the package, click Schedule without changing any of the tabs, and go directly to the Monitor tab. Refresh it until the records turn green.
    o Once it is green, click the data target and see that the file executed.
    o Now go to the InfoProvider, right-click on the file name and choose Create InfoCube. In the new screen, give a name and activate it. Drag the key figures to one side and the time characteristics and characteristics to the other side.
    o Then create the dimension tables, assign them and activate the cube. Then right-click on the file name, select Create Update Rules, select the ODS option, give the file name and activate it.
    o Then come back to the main screen and refresh it. When you see 8 file names, you will know that it is using the data mart concept.
    o In the main screen, right-click the ODS file; in the pop-up that opens, select Update ODS Data Target. Another screen opens in which you can see two options, full update and initial update; select initial update.
    o Another InfoPackage screen opens with External Data, Data Targets and Scheduler; select the Later in Background option, then in the next screen click Immediate, click Save (it closes automatically), then click Start.
    o Then select the Monitor option, then the contents, then the field to be selected, then the file to be executed.
    regards
    ashwin

  • Error when loading from External Tables in OWB 11g

    Hi,
    I face a strange problem while loading data from a flat file into an external table.
    ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
    error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
    In a total of 9771 records, nearly 70 are rejected due to the above error. The column (EXPIRED) where the error is reported doesn't have a value longer than 1 character at all. I suspect it to be a different problem.
    Example: one such record that got rejected is as follows:
    C|234|Littérature commentée|*N*|2354|123
    Highlighted in bold is the EXPIRED column.
    When I tried to insert this record into the external table using the UTL_FILE utility, it got loaded successfully. But when I try with the file already existing in the file directory, it again fails with the above error. I would also like to mention that not all the records which have been loaded are OK; please have a look at the DESCRIPTION column, which is highlighted. The original information in the data file looks like:
    C|325|*Revue Générale*|N|2445|132
    In the external table the description value is replaced by the inverted '?' as follows:
    Reue G¿rale
    Please help.
    Thanks,
    JL.

    user1130292 wrote: [original post quoted in full; snipped]
    Sorry, I couldn't see the highlighted text. Could you please enclose it in code tags?
    Also post the table definition with attributes. BTW, what's your NLS_LANGUAGE set to?
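    The NLS question matters here. As a hedged illustration (a sketch, not a diagnosis of this exact table) of why accented characters can trip a length check: the same text occupies more bytes in UTF-8 than in ISO-8859-1, so a column sized in bytes can overflow when the declared characterset doesn't match the file's.

        import java.nio.charset.StandardCharsets;

        public class ByteLengthDemo {
            public static void main(String[] args) {
                String value = "Littérature commentée";  // sample field from the record above
                // 21 characters, but the two accented letters take 2 bytes each in UTF-8
                System.out.println(value.length());                                  // 21
                System.out.println(value.getBytes(StandardCharsets.ISO_8859_1).length); // 21
                System.out.println(value.getBytes(StandardCharsets.UTF_8).length);      // 23
            }
        }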

  • BPC 7.5: Delta Load when loading from BI InfoProvider

    Hi,
    in BPC 7.5, running a package based on process chain "CPMB/LOAD_INFOPROVIDER" loads data directly from an SAP BI InfoProvider into a BPC cube. Among the options you can choose "Merge Data Values" or "Replace & Clear DataValues".
    According to the description, the first one "Imports all records, leaving all remaining records in the destination intact", the second one "Clears the data values for any existing records that mirror each entity/category/time combination defined in the source, then imports the source records".
    I tried both, and both (!!) result in what you would expect of "Replace & Clear". Both first create storno (reversal) records in the cube and then add the new values.
    Is this an error or intended behaviour? I didn't find any SAP Notes on this topic, but I doubt it's right...
    Is there any way to achieve a merge or, even better, a delta load?
    Thanks a lot for any input given.
    bate

    Hi Bate,
    It is indeed a bit confusing. I'll translate the BPC-speak for you
    Replace & Clear:
    1. Look up all CATEGORY/ENTITY/TIME combinations in the incoming data
    2. Clear all records in application with the same CATEGORY/ENTITY/TIME combinations that exist in the incoming data records
    3. Load incoming data to the cube
    Merge:
    1. Load incoming data to the cube record-by-record, overwriting existing data.
    This means that "Replace & Clear" might clear out existing records in the cube that do not share the full key of incoming records but do share the CATEGORY/ENTITY/TIME dimension values. For example, a records with CATEGORY/ENTITY/TIME/ACCOUNT values of ACTUAL/1000/2010.JAN/ACCT1 would be deleted when loading an incoming record with values ACTUAL/1000/2010.JAN/ACCT2, if you use the "Replace & Clear" method, but would not be deleted if you use the "Merge" method.
    There is no option to load data additively, like it is loaded to a BW InfoCube.
    Cheers,
    Ethan

  • How to Prevent CacheStore from Getting Called when Loading from DB

    Hi,
    I have a cache with write-behind enabled. The issue is that when I'm initializing the cache from its persistence store (SQL Server 2005), I don't want it to call its CacheStore implementation. The Coherence book by Aleksandar Seovic recommends using another cache to control writing to different caches, sort of like a global flag, but that only works in a write-through scenario, not with write-behind. One of my theories is to use a MapTrigger: when I'm loading from the DB, I intercept the call and tell the object not to write to the DB, maybe by writing directly to the backing map, though I'm not sure whether writing to the backing map prevents the CacheStore from being called. Please let me know. Thanks.

    Hi user13402724,
    The documentation covers a scenario like this here: http://download.oracle.com/docs/cd/E14526_01/coh.350/e14509/appsampcachestore.htm#sthref512
    JK
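    Besides the documented sample, one hedged sketch of a workaround (the cache name, class name, and key-loading details are illustrative assumptions): prime the cache through read-through instead of put(). A read miss goes through CacheStore.load()/loadAll() and queues nothing on the write-behind thread, so nothing is written back to the database during warm-up.

        import com.tangosol.net.CacheFactory;
        import com.tangosol.net.NamedCache;
        import java.util.Collection;

        public class CacheWarmer {
            // Warm the cache for the given keys without triggering store():
            // getAll() fetches missing entries via CacheStore.load()/loadAll(),
            // and reads are never scheduled for write-behind.
            public static void warm(String cacheName, Collection keysFromDb) {
                NamedCache cache = CacheFactory.getCache(cacheName);
                cache.getAll(keysFromDb);
            }
        }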
