Flatfile Issue

Hello,
I am uploading a CSV file to the BI application server using WinSCP; essentially this tool FTPs the CSV file to the BI application server.
When I try to load the data into the PSA, I get the error message below:
"Cannot convert character sets for one or more characters"
But if I load the same file from "Local Workstation", I do not get any error messages.
I am guessing the WinSCP tool is introducing some invalid characters while FTPing the file.
Where can I see which record contains the invalid character?
I get this error while loading into the PSA.
Your input will be appreciated.

Ram,
Check the format of the DATE and NUMC fields in your input file.
When you do a flat file upload from the application server, you need to ensure the field formats are correct. If required, change them in the flat file and then FTP it to the application server.
Load the data into the PSA again.
Check and let me know.
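To answer the question of which record carries the invalid character: the file can be scanned on the workstation before (or after) FTPing it. A minimal sketch in Python; the file name, the source encoding (UTF-8), and the target codepage (cp1252) are assumptions to be adapted to whatever your BW system expects:

```python
# Scan a CSV file for records containing characters that cannot be
# represented in the target codepage. Prints the record number and the
# conversion error. File name and encodings are assumptions.

def find_bad_records(path, target_encoding="cp1252"):
    bad = []
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, start=1):
            try:
                text = raw.decode("utf-8")       # encoding the file was saved in
                text.encode(target_encoding)     # encoding the server expects
            except (UnicodeDecodeError, UnicodeEncodeError) as exc:
                bad.append((lineno, str(exc)))
    return bad

if __name__ == "__main__":
    for lineno, message in find_bad_records("Test.csv"):
        print(f"record {lineno}: {message}")
```

Running this before the FTP step would show exactly which record numbers the PSA load will choke on.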

Similar Messages

  • Convert character sets for one or more characters

    Hello,
    Issue:
    Source File: Test.csv
    Source file dropped on application server as csv file via FTP
    Total records in csv file 102396
    While loading into the PSA I am able to load only 38,000 records; after that I get the error message "convert character sets for one or more characters".
    If I load the same CSV file from the local workstation I do not get any error message and am able to load all 102,396 records.
    Has anybody faced this kind of problem? Please share with me; I will assign full points.
    Thanks

    Hi,
    check
    Flatfile Issue
    Regards,
    Arek

  • Issue while uploading data from flatfile to Transaction(VK13).

    Hi All,
    I am facing an issue while uploading data from a flat file to transaction VK13. My flat file is as shown below.
    SalesOrganization    DistributionChannel    Customer    Material    Releasestatus    Amount    Currency    Quantity    UOM    ValidFrom    ValidTo
    2000    01    0010029614    AT309739    A    20.00    USD    1    PC    09/11/2014    12/31/9999
    If I upload these data using the RV_CONDITION_COPY FM, they are successfully uploaded to the relevant tables (KONH, KONP), but in transaction VK13 all values come through fine except UOM: instead of PC it is shown as ***. I do not understand why this is happening; please give me an idea to solve my issue.
    Regards,
    Chakradhar.

    Hi Raymond,
    Thanks for your reply. Yes, if I use CONVERSION_EXIT_CUNIT_INPUT in my program, the issue is: assume the user gives PC as the value for the UOM field in the flat file and uploads it. The value PC is successfully uploaded to the UOM field in transaction VK13, but in the database table (KONP) it shows as ST.
    Regards,
    Chakradhar.

  • EPMA 11.1.2.1 dimension build from FlatFile to EPMA to EssbaseASO using ODI, member sorting order issues

    We are building our Essbase ASO cube using flat files which are pushed to EPMA via interface tables; the EPMA Essbase app is then deployed to Essbase, and this entire job is done through an ODI interface/package. The problem I am facing is the order in which members appear in EPMA: even though the flat file has the right sort order, by the time the hierarchies arrive in the EPMA interface table the sort order has changed randomly.
    I am using the "File to SQL control append" IKM in ODI, and somewhere along the way I saw a suggestion to add a new option to insert "ORDER BY" into the IKM. I successfully did this and it did change the sort order, but even this is not the right order (the order in which my flat file has the dimensions).
    I can't understand why Oracle/ODI needs to randomize the rows without being asked to do so! Please let me know if anyone has faced this issue before and whether they were able to resolve it.

    The EPMA interface tables have a SORTORDER column. Make sure this is populated with numeric values sequencing the order in which you want your members to appear in the EPMA hierarchies; when you import from the interface tables, this order will be respected. Before this feature was introduced, the workaround was to create a view referencing the interface tables that imposed the required ORDER BY clause, but this isn't required in 11.1.2.1: just use the SORTORDER column.
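    If the interface tables are loaded programmatically, the SORTORDER values can simply be generated from the flat file's own row order. A minimal sketch in Python; the row representation and column name handling around it are assumptions:

    ```python
    # Assign a sequential SORTORDER to each member row, preserving the order
    # in which the rows appear in the flat file. "SORTORDER" matches the EPMA
    # interface table column; everything else here is an assumption.
    def add_sort_order(rows):
        """rows: list of dicts parsed from the flat file, in file order."""
        for order, row in enumerate(rows, start=1):
            row["SORTORDER"] = order
        return rows
    ```

    Writing these numbered rows to the interface table lets the EPMA import reproduce the flat file's ordering without any ORDER BY tricks in the IKM.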

  • Idoc flatfile to IDOC xml issue with new PI7.11 module SAP_XI_IDOC/IDOCFlat

    Hi,
    I am trying to develop a scenario as mentioned in the blog using the new module available in PI 7.1, but I am getting this error:
    "Error: com.sap.conn.idoc.IDocMetaDataUnavailableException: (3) IDOC_ERROR_METADATA_UNAVAILABLE: The meta data for the IDoc type "ORDERS05" is unavailable."
    I have made every configuration correctly and the IDoc metadata is available in both SAP R/3 and PI, but it is still complaining that the metadata does not exist.
    Blog:  http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/13743%3Fpage%3Dlast%26x-order%3Ddate
    Did anybody face an issue with the new module "SAP_XI_IDOC/IDOCFlatToXmlConvertor"? Please help me or give me more information on why I am getting this metadata error.
    Thank you,
    Sri

    Hi Sri,
    To convert an IDoc flat file into IDoc XML per the given blog, the IDoc flat file should be in the standard format, like:
    E2EDK01005                    0000047110815000000000001.........
    E2EDKA1003                    0000047110815000000000002.........
    E2EDKA1003                    0000047110815000000000003..........
    E2EDKA1003                    0000047110815000000000004........
    The flat file lines are related by the IDoc number "000004711081500000" and the segment sequence "0000001".
    If your flat file is not in this format, I don't think the module is able to convert it into IDoc XML. If your file is already in this format, then it may be an issue with the destinations created in NWA.
    Thanks
    Harish
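    A quick sanity check of the flat file against the format Harish describes can be scripted. This sketch assumes the segment name occupies the first 30 characters, followed by the 18-character IDoc number and the 7-character segment sequence; verify these offsets against your own control record before relying on it:

    ```python
    # Verify that every line of an IDoc flat file carries the same IDoc
    # number and a consecutive segment sequence. Field offsets (30/18/7)
    # are assumptions taken from the sample lines above.
    def check_idoc_flatfile(lines):
        idoc_number = None
        for expected_seq, line in enumerate(lines, start=1):
            number = line[30:48]        # 18-char IDoc number
            seq = line[48:55]           # 7-char segment sequence
            if idoc_number is None:
                idoc_number = number
            if number != idoc_number or int(seq) != expected_seq:
                return False
        return True
    ```

    If this returns False for your file, the module's complaint is likely about the file layout rather than the IDoc metadata.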

  • Pipe delimiter issue in an SSIS flat file.

    I have a flat file with a pipe delimiter and 9 columns, each column having a different data length. I am having an issue with one of the columns, Address.
    A user entered a value with a pipe symbol in his address, so my SSIS package got messed up because it treats the user-entered additional pipe as a column separator. So my package failed. Please help me sort out this issue, or
    do I need to consider this record as bad data?

    I would say you would have to consider this as bad data and then process the rest of the records, as the pipe is the only way one can identify the change in columns. The way to identify bad data is:
    a) Use a script task before your DFT to check each row for the number of pipes expected; e.g. if you expect 10 columns then the pipe count should be 9.
    b) Compare this condition and write a new file with only valid records, then process this file.
    Abhinav http://bishtabhinav.wordpress.com/
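    Abhinav's step (a), counting delimiters per row and diverting mismatches, can be sketched in a few lines of Python (file names and the column count are assumptions; in SSIS itself this logic would live in a script task):

    ```python
    # Split a pipe-delimited file into valid and bad-data files by
    # delimiter count: 9 columns means exactly 8 pipes per row.
    # File names are assumptions.
    EXPECTED_COLUMNS = 9

    def split_good_bad(in_path, good_path, bad_path):
        with open(in_path) as src, \
             open(good_path, "w") as good, \
             open(bad_path, "w") as bad:
            for line in src:
                if line.rstrip("\n").count("|") == EXPECTED_COLUMNS - 1:
                    good.write(line)
                else:
                    bad.write(line)
    ```

    The valid-records file can then be fed to the package, while the bad-data file goes to manual review.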

  • Idoc to flatfile FCC issue

    Hello everybody,
    I am implementing an IDoc-to-flat-file scenario (PI 7.1). Writing an XML file to the directory works, but as soon as I use FCC to generate a txt file, no file is written, and SXMB_MONI shows an acknowledgement with a system error. In both cases SXMB_MONI shows the following acknowledgement:
    <?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
    - <!--
    Technical routing of the response
      -->
    - <SAP:Error xmlns:SAP="http://sap.com/xi/XI/Message/30" xmlns:SOAP="http://schemas.xmlsoap.org/soap/envelope/" SOAP:mustUnderstand="1">
      <SAP:Category>XIServer</SAP:Category>
      <SAP:Code area="OUTBINDING">CO_TXT_OUTBINDING_NOT_FOUND</SAP:Code>
      <SAP:P1>,ZZ_CC_IBDLV</SAP:P1>
      <SAP:P2>,ZZ_CC_IBDLV,,</SAP:P2>
      <SAP:P3 />
      <SAP:P4 />
      <SAP:AdditionalText />
      <SAP:Stack>No receiver agreement could be found for sender ,ZZ_CC_IBDLV to receiver ,ZZ_CC_IBDLV,,.</SAP:Stack>
      <SAP:Retry>M</SAP:Retry>
      </SAP:Error>
    Inbound = idoc SHP_OBDLV_SAFE_REPLICA
    message type for outbound:
    ZZ_DT_OBDL     Complex Type          
    header     Element          1
    -test1     Element     xsd:string     1
    -test2     Element     xsd:string     1
    item     Element          0..unbounded
    -itemtest     Element     xsd:string     1
    -itemtest2     Element     xsd:string     1
    FCC:
    Recordset structure:                header,item,*
    header.addHeaderLine          1
    header.fieldNames               test1,test2
    header.fieldSeperator               ,
    header.processConfiguration          FromConfiguration
    header.endSeperator               'nl'
    item.addHeaderLine               1
    item.fieldNames               itemtest,itemtest2
    item.fieldSeperator               ,
    item.processConfiguration          FromConfiguration
    item.endSeperator               'nl'
    I also tried StrictXml2PlainBean… it doesn't work either… Thanks in advance,
    Lukas

    Writing an XML file to the directory works; as soon as I use FCC to generate a txt file, no file is written
    --->
    use
    Recordset structure: header,item
    In receiver FCC, do not specify occurrences in recordset structure
    The fieldNames parameter is also not required.
    RWB--> Message Monitoring, select Details of your message, you will find FCC errors here

  • Flatfile active sync issue

    Hi Friends,
    I am running two flat file active syncs. The second FF AS depends on the first one having completed. I am not sure how to make sure that the second does not run unless the first is complete.
    Please share any ideas on how else I can achieve this scenario.
    TIA

    Insufficient data, Captain.
    Do you process all records in the second FF?
    Do you process a diff of records in the second FF?
    What are the frequencies of the FFAS schedule? first run/second run.. are they hourly, every 30 minutes, every 5 minutes, every minute?
    All these would have an impact on the design.
    The only way to be sure is to look at the log of the 1st FFAS: when the string "Poll complete.. setting schedule for..." (or whatever the exact text is) appears, then and only then move the latest file for the second FFAS into the directory where the 2nd FFAS is looking.
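    That log-watching step can be automated. A minimal sketch in Python, where the log path, the marker text, and the file locations are all assumptions to be adapted:

    ```python
    # Wait until the first active sync's log reports poll completion, then
    # move the second feed's file into the directory the second active sync
    # watches. Marker text and paths are assumptions.
    import os
    import shutil
    import time

    MARKER = "Poll complete"

    def wait_and_move(log_path, src_file, watch_dir,
                      poll_seconds=30, timeout=3600):
        deadline = time.time() + timeout
        while time.time() < deadline:
            with open(log_path) as log:
                if MARKER in log.read():
                    dest = os.path.join(watch_dir, os.path.basename(src_file))
                    return shutil.move(src_file, dest)
            time.sleep(poll_seconds)
        raise TimeoutError("first active sync did not finish in time")
    ```

    Scheduling this between the two FFAS runs enforces the ordering without changing either active sync's own configuration.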

  • Authorization issue in BPS

    Hi guys,
    I have an authorization issue in a BPS application: a user can upload a flat file into a BPS cube, but only when I select 0BI_ALL in the authorization object S_RS_AUTH.
    Without selecting 0BI_ALL (i.e. with another analysis authorization), the user gets a message that he does not have enough authorization...
    But now the user gets access in BW reporting to data for all organizational characteristics, such as the organizational unit (0ORGUNIT).
    How is it possible to design the authorizations / analysis authorizations so that the same user can upload data via flat file but only gets access to transaction data for the organizational data he should see?
    How should the analysis authorization be designed? Does it have something to do with the technical characteristics like 0TCAACTVT?
    THX in advance!
    Clemens

    Hi,
    Have you tried creating an authorization variable for the organizational unit?
    This will give restricted access to data based on the authorization assigned.
    Thanks
    Pratyush

  • Unable to configure Flatfile Reconciliation in OIM 9.1.0.1

    Hello,
    I am facing this weird issue with OIM in my new project. I have followed the steps below to configure flat file reconciliation, using it as a trusted source.
    1. Transport Provider- Shared Drive
    2. Format Provider- CSV
    I have filled in the locations for the staging directory (parent) and the archiving directory. I am using Cp1251 for file encoding.
    The problem I am facing is that as soon as I move forward from the above step to Mapping, I do not see any fields under Source and Reconciliation Staging
    in Step 3: Modify Connector Configuration. What could possibly be the reason for this? The flat file is already in the parent directory location before starting the configuration of the flat file GTC. There are no logs generated for this, for obvious reasons. It would be great if someone could reply soon, as I have been facing this issue since yesterday morning.

    I created a new flat file and a new connector. Now I am facing a different issue. The data is getting reconciled as per the reconciliation events, and the status is "Event Received".
    The basic mapping is:
    login->User_login
    firstname->First name
    lastname->Last name
    organization->Organization Name
    User Type->Xellerate Type
    Employee Type->Role
    Password-> hardcoded as "abcd@12345"
    The error which I am getting is;
    2012-10-27 09:03:29,370 DEBUG [XELLERATE.DATABASE] select evt.evt_key, evt.evt_name, evt.evt_package from dob dob, evt evt, dvt dvt where dob.dob_key=dvt.dob_key and dvt.evt_key=evt.evt_key and (dob.dob_name='com.thortech.xl.dataobj.tcADJ' or dob.dob_name='com.thortech.xl.dataobj.tcTableDataObj' or dob.dob_name='com.thortech.xl.dataobj.tcDataObj' ) and dvt.dvt_post_update_sequence>0 order by dvt.dvt_post_update_sequence
    2012-10-27 09:03:29,371 INFO [XELLERATE.PERFORMANCE] Query: DB: 1, LOAD: 0, TOTAL: 1
    2012-10-27 09:03:29,372 DEBUG [XELLERATE.SERVER] Class/Method: tcADJ/eventPreInsert left.
    2012-10-27 09:03:29,372 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getAllowedOperation entered.
    2012-10-27 09:03:29,372 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getAllowedOperation - Data: mstableName - Value: adj
    2012-10-27 09:03:29,372 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getAllowedOperation:else entered.
    2012-10-27 09:03:29,372 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getAllowedOperation:if moData.isNull entered.
    2012-10-27 09:03:29,372 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:preWrite entered.
    2012-10-27 09:03:29,372 DEBUG [XELLERATE.DATABASE] select usr_key from usr where USR_LOGIN=? and USR_STATUS!='Deleted'
    2012-10-27 09:03:29,373 INFO [XELLERATE.PERFORMANCE] Query: DB: 1, LOAD: 0, TOTAL: 1
    2012-10-27 09:03:29,373 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:preWrite left.
    2012-10-27 09:03:29,373 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getPreparedStatementForUpdate entered.
    2012-10-27 09:03:29,373 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getWhere entered.
    2012-10-27 09:03:29,373 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getWhere - Data: psKeyName - Value: adj_key
    2012-10-27 09:03:29,373 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getWhere - Data: msKeyValue - Value: 14
    2012-10-27 09:03:29,373 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getWhere left.
    2012-10-27 09:03:29,374 DEBUG [XELLERATE.SERVER] Class/Method: tcTableDataObj:getPreparedStatementForUpdate left.
    2012-10-27 09:03:29,374 DEBUG [XELLERATE.DATABASE] update adj set ADJ_PERSIST=?, ADJ_INST_NAME=?, ADJ_UPDATE=?, adj_rowver=? where adj_key=14 and adj_rowver=HEXTORAW('0000000000000000')
    2012-10-27 09:03:29,374 DEBUG [XELLERATE.PREPAREDSTATEMENT] Class/Method: tcDataBase/writeStatement - Data: psSql - Value: update adj set ADJ_PERSIST=?, ADJ_INST_NAME=?, ADJ_UPDATE=?, adj_rowver=? where adj_key=14 and adj_rowver=HEXTORAW('0000000000000000')
    2012-10-27 09:03:29,374 DEBUG [XELLERATE.PREPAREDSTATEMENT] Class/Method: tcDataBase/writeStatement: Param: 1 is set to null
    2012-10-27 09:03:29,374 DEBUG [XELLERATE.PREPAREDSTATEMENT] Class/Method: tcDataBase/writeStatement: Param: 2 is set to null
    2012-10-27 09:03:29,374 DEBUG [XELLERATE.PREPAREDSTATEMENT] Class/Method: tcDataBase/writeStatement: Param (Timestamp): 3 is set to 2012-10-27 09:03:29.372
    2012-10-27 09:03:29,374 DEBUG [XELLERATE.PREPAREDSTATEMENT] Class/Method: tcDataBase/writeStatement: Param (ByteArray): 4 is set to java.io.ByteArrayInputStream@126ec30c
    2012-10-27 09:03:29,376 INFO [XELLERATE.PERFORMANCE] Query: DB: 2
    2012-10-27 09:03:29,377 DEBUG [XELLERATE.AUDITOR] Class/Method: AuditEngine/getAuditEngine entered.
    2012-10-27 09:03:29,377 DEBUG [XELLERATE.SERVER] Class/Method: tcADJ/eventPostUpdat entered.
    2012-10-27 09:03:29,377 DEBUG [XELLERATE.SERVER] Class/Method: tcADJ/updateAdpStatus entered.
    2012-10-27 09:03:29,377 DEBUG [XELLERATE.SERVER] Class/Method: tcDataBase/readPartialStatement entered.
    2012-10-27 09:03:29,377 INFO [XELLERATE.DATABASE] DB read: select adp.adp_key, adp_status, adp_rowver from adp adp, adt adt where adt.adp_key = adp.adp_key and adt.adt_key = 16
    2012-10-27 09:03:29,377 DEBUG [XELLERATE.DATABASE] select adp.adp_key, adp_status, adp_rowver from adp adp, adt adt where adt.adp_key = adp.adp_key and adt.adt_key = 16
    2012-10-27 09:03:29,378 INFO [XELLERATE.PERFORMANCE] Query: DB: 1, LOAD: 0, TOTAL: 1
    2012-10-27 09:03:29,378 DEBUG [XELLERATE.SERVER] Class/Method: tcADJ:updateAdpStatus - Data: sAdpStatus - Value: Recompile
    2012-10-27 09:03:29,378 DEBUG [XELLERATE.SERVER] Class/Method: tcADJ/updateAdpStatus left.
    2012-10-27 09:03:29,378 DEBUG [XELLERATE.SERVER] Class/Method: tcDataBase/readPartialStatement entered.
    2012-10-27 09:03:29,378 INFO [XELLERATE.DATABASE] DB read: select evt.evt_key, evt.evt_name, evt.evt_package from dob dob, evt evt, dvt dvt where dob.dob_key=dvt.dob_key and dvt.evt_key=evt.evt_key and (dob.dob_name='com.thortech.xl.dataobj.tcADJ' or dob.dob_name='com.thortech.xl.dataobj.tcTableDataObj' or dob.dob_name='com.thortech.xl.dataobj.tcDataObj' ) and dvt.dvt_post_update_sequence>0 order by dvt.dvt_post_update_sequence
    2012-10-27 09:03:29,379 DEBUG [XELLERATE.DATABASE] select evt.evt_key, evt.evt_name, evt.evt_package from dob dob, evt evt, dvt dvt where dob.dob_key=dvt.dob_key and dvt.evt_key=evt.evt_key and (dob.dob_name='com.thortech.xl.dataobj.tcADJ' or dob.dob_name='com.thortech.xl.dataobj.tcTableDataObj' or dob.dob_name='com.thortech.xl.dataobj.tcDataObj' ) and dvt.dvt_post_update_sequence>0 order by dvt.dvt_post_update_sequence
    2012-10-27 09:03:29,380 INFO [XELLERATE.PERFORMANCE] Query: DB: 1, LOAD: 0, TOTAL: 1
    2012-10-27 09:03:29,380 DEBUG [XELLERATE.SERVER] Class/Method: tcADJ/eventPostUpdat left.
    2012-10-27 09:03:29,380 DEBUG [XELLERATE.SERVER] Class/Method: tcDataObj/save left.
    2012-10-27 09:03:29,393 DEBUG [XELLERATE.ACCOUNTMANAGEMENT] Class/Method: UsernamePasswordLoginModule/initialize - Data: dburl - Value: {2}
    2012-10-27 09:03:29,393 DEBUG [XELLERATE.ACCOUNTMANAGEMENT] Class/Method: UsernamePasswordLoginModule/initialize - Data: dbuser - Value: {2}
    Please suggest what I should do now.

  • Error in loading the data from flatfile

    Hello All,
           When I am loading the data from a flat file I am getting the error "ERROR 4 WHEN LOADING THE EXTERNAL DATA". Is it because of the mapping in the transfer rules, or an error in the flat file?
    Points will be assigned to all  helpful answer.
    Thanks & Regards
    Priya

    Hi Priya,
    The reasons may be: the file is open, the format/flat file structure is not correct, the mapping/transfer structure may not be correct, presence of invalid characters/data inconsistency in the file, etc.
    Check that the flat file is in .CSV format.
    You have to save it in .CSV format for the flat file loading to work.
    Also check for connection issues between the source system and BW; sometimes it may be due to inactive update rules.
    Refer
    error 1
    Find out the actual reason and let us know.
    Hope this helps.
    Regards,
    Raghu.

  • Error while collecting Custom FlatFile related InfoCubes in CTS

    Hi,
    I am getting an error while collecting the flat-file-related custom InfoCube, saying that the cube is not in version A. I checked that the cube is in the active version, i.e. Revised Version = Active Version; Object Status = Active, executable.
    Any idea how to resolve this issue?
    Here is the information on the error message:
    Object 'PPY_C101' (CUBE) of type 'InfoCube' is not available in version 'A'
    Message no. RSO252
    Diagnosis
    You wanted to generate an object with the name 'PPY_C101' (in transport request CUBE) of type 'InfoCube' (TLOGO). This is, however, not available in the BW Repository database. It does not exist in the requested version A. If the version is 'D' then it is possible that an error arose during the delivery or installation. If the version is 'A' then the Object was either not created or not activated.
    System Response
    The object was not taken into account in the next stage of processing.
    Thanks,


  • Output Type to create FlatFile

    Hi,
    We have a scenario where we have to create a flat file as output. We have developed a program that can generate the required file, but this program has a selection screen that requires a PO as input.
    Is it possible to link this program to an output type, so that whenever the output type is determined this program runs and the required flat file is generated?
    If yes, what changes will we have to make in the configuration and/or the ABAP code?
    Harshesh

    Hi Ravi/RS,
    First of all, thanks for the reply.
    Based on Ravi's suggestion we have introduced a FORM/ENDFORM statement, and based on RS's suggestion we have made sure we are not using any COMMIT statement.
    The functionality seems to be working, but we are facing an issue here. With the standard output types, when the output is successfully transmitted the output type changes to green status. In our case, our program is creating a file but it is not changing the status of the output type to green; instead it becomes red.
    Please help
    Harshesh

  • Regarding a performance issue with time-dependent hierarchies.

    Hi,
    We are loading time-dependent hierarchies from a flat file to BW. It is a weekly load and we have nearly one million records loaded. We have an issue with changes to these hierarchies over time: whenever a change occurs in a hierarchy, a new row is added to the table, which is degrading performance. Can anyone please suggest how to overcome performance issues related to time-dependent hierarchies?
    Regards
    Srinivas.G

    Hello Deven,
    If you are only focusing on your application's performance from a usability aspect, i.e. less waiting time and a fast-responding UI, I would personally suggest you use AJAX. Put some of the processing on the browser side and make data retrieval from MI asynchronous; this way, your users don't have to wait for all the data to be presented, but can work on the data presented in pieces (i.e. asynchronously).
    Anyway, try googling for AJAX...
    regards
    jo

  • Insert data Flatfile table to Parent and child tables

    Hi All,
        I have a flat file table which we get from daily txt files, and I want to populate the flat file data into parent as well as child tables. The parent table has a primary key, ID. This ID is a foreign key of the child tables.
    Here is our process:
    1. Duplicates in the flat file have to be removed, keyed on the daily date.
    2. Before inserting into the parent table we have to check for duplicates (we have a unique key of 4 columns in the parent). If there is a duplicate, we delete it and then insert the unique records into the parent table (the primary key is ID). (The reason we delete the duplicate record is that we get daily files, so if any record is updated in the future, that record should be inserted into the parent and child tables; therefore we delete the old records and insert the new ones.)
    3. After inserting into the parent we have to populate the child tables from the flat file table, inserting the parent table's primary key as the foreign key of the child tables.
    4. If any truncation error occurs, those errors should go to an error log table.
    Right now we are using a cursor for this, and the cursor has a performance issue, so we are trying to find another way to populate the parent and child tables with better performance.
    Please help us find a faster way to do this process.

    Hi RajVasu,
    Since this issue is related to Transact-SQL, I will move this thread to the Transact-SQL forum. Some delay might be expected from the thread transfer; your patience is greatly appreciated.
    Thank you for your understanding and support.
    Regards,
    Katherine Xiong
    TechNet Community Support
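    As an aside, the cursor described above can usually be replaced with set-based statements: deduplicate the staging rows, delete the matching parent rows (and their children), then reinsert parents and join the children back. A sketch using sqlite3 purely for illustration; the table and column names are assumptions, and the real system is SQL Server per the thread:

    ```python
    # Set-based load of a staging (flat-file) table into parent and child
    # tables, replacing a row-by-row cursor. sqlite3 stands in for the real
    # database; all names here are illustrative assumptions.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE staging (k1, k2, load_date, detail);
    CREATE TABLE parent  (id INTEGER PRIMARY KEY, k1, k2, UNIQUE (k1, k2));
    CREATE TABLE child   (parent_id REFERENCES parent(id), detail);
    """)

    con.executemany("INSERT INTO staging VALUES (?,?,?,?)",
                    [("A", 1, "2024-01-01", "old"),
                     ("A", 1, "2024-01-02", "new"),  # duplicate key, later row
                     ("B", 2, "2024-01-02", "x")])

    # 1. Deduplicate staging: last row per business key wins
    #    (a real load would break ties on load_date).
    con.execute("""DELETE FROM staging WHERE rowid NOT IN (
        SELECT MAX(rowid) FROM staging GROUP BY k1, k2)""")

    # 2. Delete existing children and parents for the incoming keys,
    #    then insert the fresh parent rows.
    con.execute("""DELETE FROM child WHERE parent_id IN (
        SELECT p.id FROM parent p
        JOIN staging s ON p.k1 = s.k1 AND p.k2 = s.k2)""")
    con.execute("""DELETE FROM parent WHERE EXISTS (
        SELECT 1 FROM staging s
        WHERE s.k1 = parent.k1 AND s.k2 = parent.k2)""")
    con.execute("INSERT INTO parent (k1, k2) SELECT k1, k2 FROM staging")

    # 3. Populate the children with the generated parent keys.
    con.execute("""INSERT INTO child (parent_id, detail)
        SELECT p.id, s.detail FROM staging s
        JOIN parent p ON p.k1 = s.k1 AND p.k2 = s.k2""")
    ```

    Each statement processes all rows at once, which is where the speedup over the cursor comes from; truncation-error rows would be filtered into the error log table before step 2 in the real system.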
