SQL (Injection) vs. Flat File TXT vs. XML (XPath Injection) for a Database

I will have to deal with robust login sessions, so I need some advice on what type of database and language to use.
PHP and MySQL is the obvious choice, but I hear that flat TXT files are faster, so I am also considering Perl.
Perl is very flexible and has many advantages, so why do so many folks use PHP?
SQL and XPath injection are becoming more frequent, plus MD5 and SHA-512 have been cracked by some angry folks.
The Perl community is becoming very small, so it will be difficult to get support if I have a specific problem.
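Whichever language you pick, what actually stops SQL and XPath injection is keeping user input out of the query text via parameterized queries, and what protects passwords is a salted, slow hash rather than a bare MD5/SHA-512 digest. A minimal sketch of both ideas, in Python purely for illustration (PDO prepared statements in PHP or DBI placeholders in Perl work the same way; the table and column names here are made up):

import hashlib
import hmac
import os
import sqlite3

conn = sqlite3.connect("users.db")
conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT PRIMARY KEY, salt BLOB, hash BLOB)")

def store_user(name: str, password: str) -> None:
    salt = os.urandom(16)
    # Salted, iterated PBKDF2 instead of a single unsalted MD5/SHA-512 pass.
    digest = hashlib.pbkdf2_hmac("sha512", password.encode(), salt, 200_000)
    # The ? placeholders keep user input out of the SQL text, which is what
    # defeats injection; the same principle applies to XPath queries.
    conn.execute("INSERT INTO users (name, salt, hash) VALUES (?, ?, ?)", (name, salt, digest))
    conn.commit()

def check_user(name: str, password: str) -> bool:
    row = conn.execute("SELECT salt, hash FROM users WHERE name = ?", (name,)).fetchone()
    if row is None:
        return False
    salt, stored = row
    return hmac.compare_digest(hashlib.pbkdf2_hmac("sha512", password.encode(), salt, 200_000), stored)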

In our projects we are extracting SMS messages and, based on the MSISDN (telephone number) of each SMS, we are fetching the corresponding account information (server name and password) from the database through a webservice. This webservice connection is slow and unreliable, so a local copy of the MSISDN and the corresponding account information is stored and updated in flat files to avoid repeated calls through the webservice. My question is: which will be faster for access, a txt file, XML, or a different format?

So the data is stored in a DB which you are accessing through a webservice, that is too slow and unreliable, and so you are caching the results in a text file... That's horrible. The party responsible for the webservice should be fixing it, or you should rethink the architecture. All this extra complexity is basically just causing problems for you. You'd be better off just accessing the DB directly; I assume this is possible since you know there is a database. In a good SOA architecture, you should not know or care about where the data is persisted.

As for the file format: I don't think it's going to matter much at all for the numbers you cite, unless you are running this on a 20-year-old PC or something. But I think you are addressing the wrong problem.
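If the remote lookup has to stay for now, the format of the local copy matters far less than simply avoiding the repeated remote call. A rough sketch of that idea in Python (the cache file name and the webservice call are hypothetical, purely for illustration):

import json
import os

CACHE_FILE = "msisdn_cache.json"  # hypothetical local copy

def load_cache() -> dict:
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, encoding="utf-8") as fh:
            return json.load(fh)
    return {}

def save_cache(cache: dict) -> None:
    with open(CACHE_FILE, "w", encoding="utf-8") as fh:
        json.dump(cache, fh)

def account_for(msisdn: str, cache: dict) -> dict:
    # Hit the slow/unreliable webservice only when the number is not cached yet.
    if msisdn not in cache:
        cache[msisdn] = call_webservice(msisdn)  # hypothetical remote call
        save_cache(cache)
    return cache[msisdn]

Whether that cache lives in JSON, a plain txt file, or XML is a detail; the win is in not calling the webservice twice for the same number.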

Similar Messages

  • How to convert from SQL Server table to Flat file (txt file)

    I need to ask how to convert from a SQL Server table to a flat file (txt file).

    Hi
    1. Import/Export wizard
    2. bcp utility
    3. SSIS
    1. Import/Export Wizard
    First and very manual technique is the import wizard.  This is great for ad-hoc, just-slam-it-in tasks.
    In SSMS right click the database you want to import into.  Scroll to Tasks and select Import Data…
    For the data source we want our zips.txt file.  Browse for it and select it.  You should notice the wizard tries to fill in the blanks for you.  One key thing here with the file I picked is that there are " qualifiers.  So we need to make
    sure we add " into the text qualifier field.   The wizard will not do this for you.
    Go through the remaining pages to view everything.  No further changes should be needed though
    Hit next after checking the pages out and select your destination.  This in our case will be DBA.dbo.zips.
    Following the destination step, go into the edit mappings section to ensure we look good on the types and counts.
    Hit next and then finish.  Once completed you will see the count of rows transferred and the success or failure rate
    Import wizard completed and you have the data!
    bcp utility
    Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx
    This is probably going to win for speed on most occasions but is limited to the formatting of the file being imported.  For this file it actually works well with a small format file to show the contents and mappings to SQL Server.
    To create a format file all we really need is the type and the count of columns for the most basic files.  In our case the qualifier makes it a bit difficult but there is a trick to ignoring them.  The trick is to basically throw a field into the
    format file that will reference it but basically ignore it in the import process.
    Given that our format file in this case would appear like this
    9.0
    9
    1 SQLCHAR 0 0 "\"" 0 dummy1 ""
    2 SQLCHAR 0 50 "\",\"" 1 Field1 ""
    3 SQLCHAR 0 50 "\",\"" 2 Field2 ""
    4 SQLCHAR 0 50 "\",\"" 3 Field3 ""
    5 SQLCHAR 0 50 "\"," 4 Field4 ""
    6 SQLCHAR 0 50 "," 5 Field5 ""
    7 SQLCHAR 0 50 "," 6 Field6 ""
    8 SQLCHAR 0 50 "," 7 Field7 ""
    9 SQLCHAR 0 50 "\n" 8 Field8 ""
    The bcp call would be as follows
    C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "c:\zip_format_file.txt" -S LKFW0133 -T
    Given a successful run you should see this in command prompt after executing the statement
    Starting copy...
    1000 rows sent to SQL Server. Total sent: 1000
    1000 rows sent to SQL Server. Total sent: 2000
    1000 rows sent to SQL Server. Total sent: 3000
    1000 rows sent to SQL Server. Total sent: 4000
    1000 rows sent to SQL Server. Total sent: 5000
    1000 rows sent to SQL Server. Total sent: 6000
    1000 rows sent to SQL Server. Total sent: 7000
    1000 rows sent to SQL Server. Total sent: 8000
    1000 rows sent to SQL Server. Total sent: 9000
    1000 rows sent to SQL Server. Total sent: 10000
    1000 rows sent to SQL Server. Total sent: 11000
    1000 rows sent to SQL Server. Total sent: 12000
    1000 rows sent to SQL Server. Total sent: 13000
    1000 rows sent to SQL Server. Total sent: 14000
    1000 rows sent to SQL Server. Total sent: 15000
    1000 rows sent to SQL Server. Total sent: 16000
    1000 rows sent to SQL Server. Total sent: 17000
    1000 rows sent to SQL Server. Total sent: 18000
    1000 rows sent to SQL Server. Total sent: 19000
    1000 rows sent to SQL Server. Total sent: 20000
    1000 rows sent to SQL Server. Total sent: 21000
    1000 rows sent to SQL Server. Total sent: 22000
    1000 rows sent to SQL Server. Total sent: 23000
    1000 rows sent to SQL Server. Total sent: 24000
    1000 rows sent to SQL Server. Total sent: 25000
    1000 rows sent to SQL Server. Total sent: 26000
    1000 rows sent to SQL Server. Total sent: 27000
    1000 rows sent to SQL Server. Total sent: 28000
    1000 rows sent to SQL Server. Total sent: 29000
    bcp import completed!
    BULK INSERT
    Next, we have BULK INSERT given the same format file from bcp
    CREATE TABLE zips (
    Col1 nvarchar(50),
    Col2 nvarchar(50),
    Col3 nvarchar(50),
    Col4 nvarchar(50),
    Col5 nvarchar(50),
    Col6 nvarchar(50),
    Col7 nvarchar(50),
    Col8 nvarchar(50)
    )
    GO
    INSERT INTO zips
    SELECT *
    FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzus\zipcodes\zips.txt',
    FORMATFILE='C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
    ) as t1 ;
    GO
    That was simple enough given the work on the format file that we already did.  Bulk insert isn’t as fast as bcp but gives you some freedom from within TSQL and SSMS to add functionality to the import.
    SSIS
    Next is my favorite playground in SSIS
    We can do many methods in SSIS to get data from point A, to point B.  I’ll show you data flow task and the SSIS version of BULK INSERT
    First create a new integrated services project.
    Create a new flat file connection by right clicking the connection managers area.  This will be used in both methods
    Bulk insert
    You can use format file here as well which is beneficial to moving methods around.  This essentially is calling the same processes with format file usage.  Drag over a bulk insert task and double click it to go into the editor.
    Fill in the information starting with connection.  This will populate much as the wizard did.
    Example of format file usage
    Or specify your own details
    Execute this and again, we have some data
    Data Flow method
    Bring over a data flow task and double click it to go into the data flow tab.
    Bring over a flat file source and SQL Server destination.  Edit the flat file source to use the connection manager “The file” we already created.  Connect the two once they are there
    Double click the SQL Server Destination task to open the editor.  Enter in the connection manager information and select the table to import into.
    Go into the mappings and connect the dots, per se.
    A typical type-conversion issue is Unicode to non-Unicode.
    We fix this with a data conversion or an explicit conversion in the editor.  Data conversion tasks are usually the route I take.  Drag over a data conversion task and place it between the connection from the flat file source to the SQL Server destination.
    New look in the mappings
    And after execution…
    SqlBulkCopy Method
    Since we’re in the SSIS package we can use that awesome “script task” to show SqlBulkCopy.  Not only fast but also handy for those really “unique” file formats we receive so often.
    Bring over a script task into the control flow
    Double click the task and go to the script page.  Click the Design script to open up the code behind
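    The SqlBulkCopy code itself sits in that script (VB.NET/C#) and is not reproduced in this post. As a rough analogue of the same bulk-load idea outside .NET (not SqlBulkCopy itself), here is a hedged Python sketch using pyodbc; the driver name, server, table, and file path simply reuse the examples above and may need adjusting:

    import csv
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=LKFW0133;DATABASE=DBA;Trusted_Connection=yes"
    )
    cursor = conn.cursor()
    cursor.fast_executemany = True  # batches the inserts, similar in spirit to SqlBulkCopy

    # Assumes a plain comma-separated file with quote qualifiers and 8 columns.
    with open(r"C:\zips.txt", newline="") as fh:
        rows = list(csv.reader(fh))

    cursor.executemany(
        "INSERT INTO dbo.zips (Col1, Col2, Col3, Col4, Col5, Col6, Col7, Col8) "
        "VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
        rows,
    )
    conn.commit()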
    Ref.
    Ahsan Kabir Please remember to click Mark as Answer and Vote as Helpful on posts that help you. This can be beneficial to other community members reading the thread. http://www.aktechforum.blogspot.com/

  • Create an outbound flat file .txt file from SAP to UAPM.

    How to Create an outbound flat file .txt file from SAP to UAPM (Unicenter Asset Portfolio Management).

    What do you want in the flat file, and what is your doubt? Please let me know; I am working on this and can help you.

  • How to convert Flat file(.txt) data to an Idoc format(ORDERS05)

    Hi,
    How do I convert flat file (.txt) data to an IDoc format (ORDERS05)? If any FM does this, please let me know.
    thanks in advance,
    Chand
    Moderator message : Duplicate post locked. Read forum rules before posting.
    Edited by: Vinod Kumar on Jul 26, 2011 11:11 AM

    Hi,
            For more information, please check this link.
    http://sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/46759682-0401-0010-1791-bd1972bc0b8a
    Have a look at the FM IDOC_XML_FROM_FILE. Maybe it helps...
    Regards

  • Converting Flat File data into XML

    Hi Experts,
    Consider the message type of the SENDER system and flat file data
    <dt_sender>
    <root>
    <header1>   0..1
        <f1>
        <f2>
        <f3>
    <header2>   0..1
        <f4>
        <f5>
        <f6>
    <item>        1..unbounded
        <f7>
        <f8>
        <f9>
        <f10>
        <f11>
        <f12>
    </item>
    abc     def     ghi     jkl     mno     pqr
    123     123     123     123     123     123
    456     456      456     456     456     456
    How do I convert the flat file data into the following XML data? Please note that each field value is separated by a TAB delimiter... what parameters should be used?
    <root>
        <Header1>
            <f1>abc</f1>
            <f2>def</f2>
            <f3>ghi</f3>
        </Header1>
        <Header2>
            <f4>jkl</f4>
            <f5>mno</f5>
            <f6>pqr</f6>
        </Header2>
        <item>
            <f7>123</f7>
            <f8>123</f8>
            <f9>123</f9>
            <f10>123</f10>
            <f11>123</f11>
            <f12>123</f12>
            <f7>456</f7>
            <f8>456</f8>
            <f9>456</f9>
            <f10>456</f10>
            <f11>456</f11>
            <f12>456</f12>
        </item>
    </root>
    points will be given to the correct answers
    Thanks in advance.
    FAisal
    Edited by: Abdul Faisal on Feb 29, 2008 5:53 AM

    Faisal,
    When you read a file with a multiple-recordset structure, each record in the txt file should have a header from which you can identify which segment it should go to, and you identify it by using the key field value in the file adapter.
    <root>
    <header1> 0..1
    <f1>
    <f2>
    <f3>
    <header2> 0..1
    <f4>
    <f5>
    <f6>
    <item> 1..unbounded
    <f7>
    <f8>
    <f9>
    <f10>
    <f11>
    <f12>
    </item>
    for this input file
    abc def ghi jkl mno pqr
    123 123 123 123 123 123
    456 456 456 456 456 456
    abc def ghi can be read into header1 using the file adapter with a key field value, but with the same file adapter you cannot put jkl mno pqr into header2.
    Instead you should read the whole row abc def ghi jkl mno pqr into a single field and write a UDF to split the data into header1 and header2.
    Similarly, you have to take care of the item records.
    If your input file is something like this:
    abc def ghi
    jkl mno pqr
    123 123 123 123 123 123
    456 456 456 456 456 456
    abc identifies Header 1,
    jkl identifies Header 2, and so on...
    Read the whole line into a single field and write a UDF to split it into header1 and header2; similarly for item.
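    Outside the file adapter, just to make the target structure concrete: a small Python sketch that turns the tab-delimited sample above into the XML shape the poster wants (assuming the first line carries f1..f6 across header1/header2 and every following line is one item with f7..f12; here each data row becomes its own <item>):

    from xml.sax.saxutils import escape

    def flat_to_xml(lines):
        rows = [line.rstrip("\n").split("\t") for line in lines if line.strip()]
        header = rows[0]  # first row: f1..f3 -> header1, f4..f6 -> header2
        out = ["<root>", "  <header1>"]
        out += ["    <f%d>%s</f%d>" % (i + 1, escape(v), i + 1) for i, v in enumerate(header[:3])]
        out += ["  </header1>", "  <header2>"]
        out += ["    <f%d>%s</f%d>" % (i + 4, escape(v), i + 4) for i, v in enumerate(header[3:6])]
        out += ["  </header2>"]
        for row in rows[1:]:  # remaining rows: f7..f12 -> one item each
            out.append("  <item>")
            out += ["    <f%d>%s</f%d>" % (i + 7, escape(v), i + 7) for i, v in enumerate(row[:6])]
            out.append("  </item>")
        out.append("</root>")
        return "\n".join(out)

    print(flat_to_xml(["abc\tdef\tghi\tjkl\tmno\tpqr", "123\t123\t123\t123\t123\t123", "456\t456\t456\t456\t456\t456"]))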

  • Converting Idoc flat file representation to XML

    Hi ,
    I went through the guide "How To Convert Between IDoc and XML in XI 3.0". I'm concerned with the second part of the guide, which covers converting the flat file representation of an IDoc to XML. Can anyone tell me what other design and configuration objects need to be created for this scenario (message types, interfaces, mapping, etc.)?
    Also, which step of the pipeline does the converted XML go to?
    The program also expects a filename; what if I want to pass the file name dynamically? Any ideas on this one?
    Hope someone replies this time.........:)
    Thanks for you help and improving my knowledge
    Thanks
    Advait Gode.

    Hi Advait,
    Let me give you a small overview on how inbound IDOCs work before answering your question-
    The control record is the key to identifying the routing of the IDOC. If you think of IDOCs as normal mail (post), the control record is the envelope. It contains information like who the sender is, who the receiver should be, and what the envelope contains (no different from receiving letters by post).
    Then the data records contain the actual data, in our example would be the actual letter. The status records contain the tracking information.
    Traditionally SAP's IDOC interface (even before XI comes in picture) has utility programs to post incoming IDOCs in to SAP. One such program is RSEINB00 which basically takes  the IDOC file name and the port as input. This program opens the file and posts the contents to the SAP IDOC interface (which is a set of function modules) via the port. The idea is to read the control record and determine the routing and further posting to application. Note that one information in the control record is the message type/idoc type which decides how the data records need to be parsed.
    Now in the XI scenario, what happens if we receive data as a flat file? Normally, we use the file adapter and in the file adapter we provide information on how to parse the file. But if the incoming file is flat and in IDOC structure, why do we have to configure the file adapter, when the parsing capability is already available via RSEINB00/the standard IDOC interface?
    This the reason, the guide suggests you to use RSEINB00. Now, your concern is what if you need to provide a dynamic filename. My idea is to write a wrapper program. This would be an ABAP program in your integration engine. This program will determine the file name (based on a logic which should be known to you) and then call program RSEINB00 using a SUBMIT/RETURN. You would then schedule this ABAP program in background to run in fixed schedules.
    There are other ways of handling your scenario as well, but with the limited information in your request I will stop here for now. Post again if you have any more queries.
    KK

  • I can't build an xsd for a flat file (txt) to handle repeating records

    Hi - have looked at many posts around flat file schema and they don't seem to address my question.
    I have a flat file that is \n delimited
    the pattern of the data is simple:
    record1 - 90 characters
    record2 - 20 characters
    records 3..n - 248 characters - each of these records is parsed into children by the positional method
    record n+1 10 characters
    record n+2 20 characters
    so I used the Flat File Schema Wizard to generate the schema and built a map mapping the flat file schema to another XML schema. The schema looks OK - record1, record2, record n+1, record n+2 are child elements of the root. The repeating record
    section is showing up as a node with the parsed children.
    The transform is only mapping the children of the repeating records. When I test the map only the first repeating record gets parsed. No repeating happens (the actual flat file has 400+ repeating records). When I run the map in debug mode, the input
    xml shows that record1 is read in correctly, record2 is read in correctly, record3 is read in and parsed and record4 is treated like record n+1 and record5 is treated like record n+2 and the map thinks it's all finished.
    The repeating part of the schema is shown below, and you can see that I set minOccurs=1 and maxOccurs=unbounded for the node (INVOICE) and the complexType, but this is not effective syntax. I have looked at how the EDI X12 schemas look and how they handle
    looping, and it is a lot different from what the Flat File Schema Wizard is doing. Is there a good set of rules published that would guide me through this? Otherwise I will basically have to read in the lines from the file and parse them out with functoids -
    seems so inelegant. Thanks in advance.
    <xs:element minOccurs="1" maxOccurs="unbounded" name="INVOICE">
              <xs:annotation>
                <xs:appinfo>
                  <b:recordInfo structure="positional" sequence_number="3" preserve_delimiter_for_empty_data="true" suppress_trailing_delimiters="false"
    />
                </xs:appinfo>
              </xs:annotation>
              <xs:complexType>
                <xs:sequence minOccurs="1" maxOccurs="unbounded">
                  <xs:annotation>
                    <xs:appinfo>
                      <groupInfo sequence_number="0" xmlns="http://schemas.microsoft.com/BizTalk/2003"
    />
                    </xs:appinfo>
                  </xs:annotation>
                  <xs:element name="SegmentType" type="xs:string">
                    <xs:annotation>
                      <xs:appinfo>
                        <b:fieldInfo justification="left" pos_offset="0" pos_length="2" sequence_number="1" />
                      </xs:appinfo>
                    </xs:annotation>
                  </xs:element>....... more children elements
    Harold Rosenkrans

    Thanks for responding
    I gave up trying to parse the repeating record into fields. Instead I just loop through the repeating record section with an <xs:for-each> block in the xsl and use functoids to grab the fields.
    So that works for having the two, shorter header records (structure is positional) before the section of repeating records. Now I just have to figure out how to get the schema to handle the two, shorter trailer (or footer, whatever you prefer) records after
    the section of repeating records
    the error I get in VS when I test the map is [BTW I changed the element names in the schema which is why you don't see INVOICE in the error]
    When I declare the last element as being positional with a character length of 10 I get the error:
    Error 18 Native Parsing Error: Unexpected end of stream while looking for:
    '\r\n'
    The current definition being parsed is SAPARData. The stream offset where the error occurred is 1359. The line number where the error occurred is 9. The column where the error occurred is 0.
    so the first record is 77 char in length and the second is 16 char and then the repeating records (5 in the file) are 248 char and the last record is 10 char
    so an offset of 1359 puts it beyond the last record by 16 characters - so the stream reader is looking for the next repeating record.
    if I try to declare the last element as delimited I get the error:
    Error 14 Native Parsing Error: Unexpected data found while looking for:
    '\r\n'
    The current definition being parsed is SAPARData. The stream offset where the error occurred is 597. The line number where the error occurred is 5. The column where the error occurred is 0.
    so the first record is 77 char in length and the second is 16 char and then the repeating records are 248 char.
    a stream offset of 597 puts me 8 characters into the third repeating record - at this point I have only declared one trailer record in the  schema, 10 characters long.
    Why is stream reader stopping at such a weird spot?
    The bottom line is I still haven't discovered the correct schema to handle the trailer records. Even if I set maxOccurs="4" (for the repeat record declaration) it still gets the first error. How does it find an unexpected end of stream looking
    for \r\n when the maxOccurs for the repeat record declaration should leave the stream pointer in the 5th repeat record?
    I unfortunately don't have any options concerning the file structure.
    I have read a lot of posts concerning the trailer issue. I have seen a couple that looked interesting. I guess I'll just have to give them a try. The other option is to create a custom pipeline that will only take file lines of 248 characters.
    That's just disgusting !
    Harold Rosenkrans

  • EDI flat file to X12 xml using XSLT mapping

    Hi all,
    I have a scenario EDI file -> XI -> file. On the source side it is a txt IDOC document. I have created an XSLT mapping to convert the txt document to X12 XML.
    Can anybody please suggest what message type I need to choose for the source inbound message?
    Thanks
    -Kulwant

    Hi
    It is not very clear from what you have explained above:
    1) What's the format when the message enters XI?
    2) At which stage of the flow is this XSLT mapping located?
    3) When you say that your XSLT converts txt to XML, what's the ROOT tag you use?
    4) What's your incoming message structure?
    Make the above clear for better answers.
    Regards
    Vishnu

  • Flat File to simple XML structure in Mail Sender Adapter

    Hi,
    I have a scenario where I want to put the content of a flat file (plain text, not CSV or similar), which is an attachment of an e-mail, into a simple XML structure: the entire file content as the content of one XML tag. E.g.:
    file content:
    "abcdefgh"
    xml file:
    <root>
      <content>abcdefgh</content>
    </root>
    Do I need to use MessageTransformBean? Or is there an easier way?
    Thanks,
    Torsten

    Hi Dirk,
    When we use MessageTransformBean, we can use ContentDisposition to specify whether the payload has to go as an attachment or inline (as the mail itself).
    It could also be a text file, right?
    Just take a look at this..
    http://help.sap.com/saphelp_nw2004s/helpdata/en/57/0b2c4142aef623e10000000a155106/frameset.htm
    cheers,
    Prashanth
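    For what it's worth, the wrapping itself is tiny once the attachment content is in hand; a minimal Python sketch of the target payload (the file name is hypothetical):

    from xml.sax.saxutils import escape

    with open("attachment.txt", encoding="utf-8") as fh:
        content = fh.read()

    payload = "<root>\n  <content>%s</content>\n</root>" % escape(content)
    print(payload)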

  • Debatch Flat File into XML

    Hi,
    First I created a flat file in the below format, having 3 records. When I validate the flat file schema, I can see a single XML file being generated with the 3 records.
    Shakeer,Newyork,US
    Hussain,Chicago,US
    Shahid,Newyork,US
    Later my goal is to split each record and generate each record as a single XML file. I made modifications to my flat file schema. After validating the schema it generates a
    single-record XML file. Up to now it's cool.
    Now I want to apply my modified schema to the existing application
    in the BizTalk console. How do I do it?
    When I drop my flat file in the receive location, I can see my XML file in the send location, not as a single-record XML file but with
    all the records in one single file.
    Please help me resolve it.
    PS.Shakeer Hussain

    Hi Syed,
    Do the following simple steps for your requirement:
    Set “Allow Message Breakup At Infix Root” to “Yes” by selecting “Schema”
    Set “Max Occurs” to 1 by selecting the Record where you want to split.
    Create a Receive pipeline with "Flat File Disassembler" in the Disassemble stage.
    Deploy the above artifacts: the flat file schema and the receive pipeline with the Flat File Disassembler component.
    Configure the receive port with the receive pipeline and the send port with a filter on the receive port (BTS.ReceivePortName), or after debatching do whatever you want to do.
    Obviously you have other options, like calling the receive pipeline in an orchestration if you have many messages/records to be debatched, to reduce the footprint in the message-box DB. But start
    with the above steps and move on to more advanced solutions based on your need.
    You can follow the reference here..
    How to Debatch (Split) a Flat File using Flat File Schema ? 
    Another reference from our friend Mahesh-
    Debatching(Splitting) XML Message - BizTalk 2010
    If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow mark next to my reply.

  • Export SQL View to Flat File with UTF-8 Encoding

    I've set up a package in SSIS to export a SQL view to a flat file and it's working fine.  I now need to make that flat file UTF-8 encoded.  The package executes but still shows the files as ANSI encoded.
    My package consists of a Source (SQL View) -> Derived Column (casts the fields to DT_WSTR) -> Destination Flat File (Set to output UTF-8 file).
    I don't get any errors to help me troubleshoot further.  I'm running SQL Server 2005 SP2.

    Unless there is a Byte-Order-Marker (BOM - hex file prefix: EF BB BF) at the beginning of the file, and unless your data contains non-ASCII characters, I'm unsure there is a technical difference in the files, Paul.
    That is, even if the file is "encoded" UTF-8, if your data is only ASCII values (decimal values 0-127, hex 00-7F), UTF-8 doesn't really serve a purpose over ANSI encoding.  Now if you're looking for UTF-8 with specifically the BOM included, and your data is all standard ASCII, the Flat File Connection Manager can't do that, it seems.
    What the flat file connection manager is doing correctly though, is encoding values that are over decimal 127/hex 7F in UTF-8 when the encoding of the connection manager is set to 65001 (UTF-8).
    Example:
    Input data built with a script component as a source (code at the bottom of this post) and with only one WSTR output column hooked to a flat file destination component:
    a string containing only decimal value 225 (Latin small letter a with acute - á)
    Encoding set to ANSI 1252 looks like:
    E1 0D 0A (which is the ANSI encoding of the decimal character value 225 (E1) and a CR-LF (0D 0A))
    Encoding set to UTF-8 65001 looks like:
    C3 A1 0D 0A (which is the UTF-8 encoding of the decimal character value 225 (C3 A1) and a CR-LF (0D 0A))
    Note that for values over decimal 127, UTF-8 takes at least two bytes and up to four for the remaining values available.
    So, I'm comfortable now, after sitting down and going through this, that the flat file connection manager is working correctly, unless you need a BOM.
    Imports System
    Imports System.Data
    Imports System.Math
    Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
    Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

    Public Class ScriptMain
        Inherits UserComponent

        Public Overrides Sub CreateNewOutputRows()
            Output0Buffer.AddRow()
            Output0Buffer.col1 = ChrW(225)
        End Sub

    End Class
    Phil
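    The byte sequences above are easy to verify outside SSIS; a quick Python check of character 225 in both encodings (plus the UTF-8 BOM prefix mentioned earlier):

    ch = chr(225)  # Latin small letter a with acute (á)

    print(ch.encode("cp1252").hex(" "))       # e1       -> ANSI code page 1252, one byte
    print(ch.encode("utf-8").hex(" "))        # c3 a1    -> UTF-8, two bytes
    print("\ufeff".encode("utf-8").hex(" "))  # ef bb bf -> the UTF-8 byte-order mark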

  • IDOC --- XI -- HTTP (via a flat file structure, not XML)

    I am working on a scenario in which we need to send a third-party payroll service provider our HR master data records (message type HROT_UM).  The manual method is to create a flat file from an IDOC and place the file in a shared directory, then upload the flat file by logging on to the web server.  We would like to automate this process using XI.
    Is it possible to send a flat file structure to a webserver using an HTTP receiver adapter?  If not, can you provide a basic view of how to accomplish this task?  Please note that FTP is not an option for us.
    Any suggestions or recommendations would be greatly appreciated.

    hey
    Flat file over HTTP is not possible; it takes only XML.
    have a look at the following thread
    Send file through http
    thanx
    ahmad
    PL:reward with points for helpful answers

  • Flat file transformation to xml with encoding in iso-8859-1

    We have a BPEL process that picks up a flat file (fixed length), transforms it to XML and emails it. The flat file is in iso-8859-1 (which I can see when I look at special characters like accented e's in hex).
    The basic transformation that you get from the BPEL wizards doesn't do the trick; the special characters are corrupted and unreadable.
    After that I changed all the encodings (in the xsd's and xsl's) from utf-8 to iso-8859-1, but that doesn't help either. I also added an xsl:output section to my xsl (with encoding="iso-8859-1"), but that also fails...
    Does anyone have a clue? We're using SOA Suite 10.1.3.1.
    greetings,
    Jan

    Just tried that, but the problem is still there.
    I suspect that the output of the file adapter (reading the flat file) is already corrupted; you should be able to tell the file adapter what encoding the input file is in...
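    Outside BPEL, the underlying principle is the same: decode the file with the encoding it is actually in, then declare the encoding you write out. A minimal Python sketch (file names hypothetical):

    from xml.sax.saxutils import escape

    # Read the fixed-length flat file as ISO-8859-1 so accented characters survive.
    with open("input.dat", encoding="iso-8859-1") as fh:
        text = fh.read()

    # Then write the XML with an explicit, matching encoding declaration.
    with open("output.xml", "w", encoding="iso-8859-1") as out:
        out.write('<?xml version="1.0" encoding="iso-8859-1"?>\n')
        out.write("<payload>%s</payload>\n" % escape(text))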

  • Transfer data from Result Set (Execute SQL Task) into Flat File

    Hi to all,
    My problem is this:
    -) I have a stored procedure which uses a temporary table joined with another "real" table in the select statement. In particular, the stored procedure looks like this:
    create table #tmp
    (
    col1 varchar(20),
    col2 varchar(50)
    )
    --PUT SOME VALUE IN #TMP
    insert into #tmp
    values ('abc','xyz')
    --SELECT SOME VALUE
    select rt.*
    from realTable rt, #tmp
    where rt.col1 = #tmp.col1
    -) I cannot modify the stored procedure because I'm not admin of database.
    -) I HAVE THE REQUIREMENT TO TRANSFER THE DATA FROM THE SELECT STATEMENT OF THE STORED PROCEDURE INTO A FLAT FILE.
    -) THE PROBLEM is that if I use an OLE DB source task within a Data Flow Task, I am not able to map columns from the OLE DB source to the flat file destination. The reason is that in the "Columns" page of the OLE DB source task, SSIS does not retrieve
    any columns when we use a temporary table, because SSIS is not able to retrieve the metadata related to the temporary table.
    -) One possible solution to this problem is to use an Execute SQL Task to invoke the stored procedure and store the result returned from the stored procedure in a Result Set via an Object-type user variable.
    -) The problem now is: how do I transfer the data from the result set to a flat file?
    -) Reading here on this forum, the solution looks to be to use a Script Task to transfer the data from the result set to the flat file.
    QUESTIONS: How do I transfer data from a result set to a flat file using a script task? (I'm not an expert in VB.NET.) Is it really possible? P.S.: The number of rows returned by the stored procedure is very high!
    thanks in advance.

    Hi Visakh16,
    thanks for the response.
    Yours is a good idea, but I do not have permission to use DDL statements on the database in the production environment.
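    For reference, since the original ask was the script step itself: the whole job is to run the procedure and stream its rows into a flat file. A rough Python/pyodbc sketch of that idea (connection details, procedure name and output path are made up; the real SSIS script task would do the same in VB.NET or C#):

    import csv
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
    )
    cursor = conn.cursor()
    cursor.execute("SET NOCOUNT ON; EXEC dbo.MyProcedure")  # hypothetical procedure

    with open("export.txt", "w", newline="", encoding="utf-8") as fh:
        writer = csv.writer(fh, delimiter="|")
        writer.writerow([col[0] for col in cursor.description])  # header row
        while True:  # fetch in chunks so a large result set stays cheap on memory
            rows = cursor.fetchmany(10000)
            if not rows:
                break
            writer.writerows(rows)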

  • Flat File Active Sync doesn't work for account creation without unique id

    Hi,
    I'm trying to set up a FlatFileActiveSync for creation and update of accounts in IDM 7.0. I've followed the below steps for this purpose :-
    1) Create a correlation rule (a confirmation rule is not required in my case).
    2) Create a proxy admin and assign him an empty form. Also give him control over the Top organisation.
    3) Create a Flat-File Resource Adapter.
    4) Create ActiveSync input form using the (Active Sync) wizard.
    5) Start Active Sync...
    My feed file contains only 3 fields firstname, lastname, email Id.
    My correlation rule matches up with IDM accounts (Lighthouse accountId) by taking the first letter of firstname and concatenating it with lastname from the data coming in the feed file.
    Now everything works fine for account updates, i.e. if I change the email ID of somebody who already exists in IDM, I can actually see the changed email ID in the Configurator's console.
    But if I put in a record that doesn't exist, and which I expect to be created, it gives me an error.
    Although, if I introduce a unique identifier in my feed file and link it with Lighthouse.accountId the account creation works fine.
    Is this a limitation or I'm not doing something right ?
    Exception I saw in resource log with log level 4 :
    2007-04-30T10:02:12.291-0400: Error Processing Line: {lastname=Pogu, firstname=Gogu, [email protected]}
    com.waveset.adapter.iapi.IAPIException: There was a conflict with the record [{lastname=Pogu, firstname=Gogu, [email protected]}]
    and no resolution process has been specified on the adapter.
    It is recommended that you define the process for handling unmatched accounts
    on this load process.
    2007-04-30T10:02:12.292-0400: Poll complete.
    2007-04-30T10:02:12.292-0400: SARunner: loop 1076
    2007-04-30T10:02:12.314-0400: Started, paused until Mon Apr 30 10:07:12 EDT 2007
    2007-04-30T10:07:12.024-0400: Pause completed
    2007-04-30T10:07:12.038-0400: Polling
    2007-04-30T10:07:12.056-0400: Error Processing Line: {lastname=Poker, firstname=Hoker, [email protected]}
    com.waveset.adapter.iapi.IAPIException: There was a conflict with the record [{lastname=Poker, firstname=Hoker, [email protected]}]
    and no resolution process has been specified on the adapter.
    It is recommended that you define the process for handling unmatched accounts
    on this load process.

    That logic is in my correlation rule as I specified in my initial post and here's the XPRESS code for it :-
    <?xml version='1.0' encoding='UTF-8'?>
    <!DOCTYPE Rule PUBLIC 'waveset.dtd' 'waveset.dtd'>
    <!-- MemberObjectGroups="#ID#Top" description="Find out if a resource account is correlated to an IDM account" id="#ID#D23CC16ECF6E5D42:-4527465C:11224925657:-769F" lastMod="61" lastModifier="Configurator" name="HR_DB_CORR" subtype="SUBTYPE_ACCOUNT_CORRELATION_RULE"-->
    <Rule subtype='SUBTYPE_ACCOUNT_CORRELATION_RULE' id='#ID#D23CC16ECF6E5D42:-4527465C:11224925657:-769F' name='HR_DB_CORR' creator='Configurator' createDate='1177449448746' lastModifier='Configurator' lastModDate='1177686884156' lastMod='61'>
    <Description>Find out if a resource account is correlated to an IDM account</Description>
    <cond>
    <and>
    <notnull>
    <ref>firstname</ref>
    </notnull>
    <notnull>
    <ref>lastname</ref>
    </notnull>
    </and>
    <block>
    <concat>
    <substr>
    <ref>firstname</ref>
    <i>0</i>
    <i>1</i>
    </substr>
    <ref>lastname</ref>
    </concat>
    </block>
    <s>false</s>
    </cond>
    <MemberObjectGroups>
    <ObjectRef type='ObjectGroup' id='#ID#Top' name='Top'/>
    </MemberObjectGroups>
    </Rule>
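    Read literally, the rule just builds the candidate accountId from the two feed fields; roughly, in Python terms (illustrative only):

    def correlate(firstname, lastname):
        # First letter of firstname concatenated with lastname, e.g. "Gogu" + "Pogu" -> "GPogu"
        if firstname and lastname:
            return firstname[0] + lastname
        return None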
    Although this is not specified in the Active Sync input form but in the correlation rule attribute of the Active Sync config (using the wizard): do I need to specify it there using the Field function?
    Also, I figured out today that I needed to restart the IDM instance after changing the value of the "Create Unmatched Accounts" flag, and now the error is as below:
    <WavesetResult>
    <ResultItem type='error' status='error'>
    <ResultError throwable='com.waveset.util.WavesetException'>
    <Message id='SES_VIEW_CHECKIN_ERROR'>
    </Message>
    <StackTrace>com.waveset.util.WavesetException: Unable to checkin view. No account ID specified.&#xA;&#x9;at com.waveset.view.UserViewer.checkinView(UserViewer.java:1165)&#xA;&#x9;at com.waveset.object.ViewMaster.checkinView(ViewMaster.java:727)&#xA;&#x9;at com.waveset.sync.IAPIUserImpl.processCommand(IAPIUserImpl.java:526)&#xA;&#x9;at com.waveset.sync.IAPIUserImpl.submitCreate(IAPIUserImpl.java:195)&#xA;&#x9;at com.waveset.sync.IAPIUserImpl.submit(IAPIUserImpl.java:749)&#xA;&#x9;at com.waveset.adapter.FlatFileActiveSyncAdapter.processLine(FlatFileActiveSyncAdapter.java:404)&#xA;&#x9;at com.waveset.adapter.FlatFileActiveSyncAdapter.processFlatFile(FlatFileActiveSyncAdapter.java:350)&#xA;&#x9;at com.waveset.adapter.FlatFileActiveSyncAdapter.poll(FlatFileActiveSyncAdapter.java:307)&#xA;&#x9;at com.waveset.task.SARunner.doRealWork(SARunner.java:288)&#xA;&#x9;at com.waveset.task.Executor.execute(Executor.java:154)&#xA;&#x9;at com.waveset.task.TaskThread.run(TaskThread.java:132)&#xA;</StackTrace>
    </ResultError>
    </ResultItem>
    </WavesetResult>
