Using a text file to make a query

Hello, and sorry to disturb you, but as a newbie I hope to find a solution to my problem...
My users produce a text file that contains one id (primary key) per line.
My goal is to let them easily use that file as part of a query that I define, e.g. they provide that list and the interface outputs the values related to those ids.
I read that it is easy to import a file, but I haven't understood how I can transform its content into a query. Or maybe I could create a text field where the user can paste this list, but I think there's a limit of 30 KB there...
Thanks in advance for any answer!
Simone

Thanks Vikas for your answer,
I already looked on the forum, searching for all those sub-topics. The only thing I'm not yet capable of is parsing the blob content. I'm looking through the forum but I'm not finding what I need; if you could kindly give me the link to the right discussion I'd really appreciate it.
Thanks in advance
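For what it's worth, once the file content is available as a string, the parsing step itself is straightforward. Here is a minimal Java sketch of turning a one-id-per-line list into an IN-clause with bind placeholders; the table name "items" and column name "id" are hypothetical, and the one-id-per-line format is an assumption:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class IdListToQuery {
    // Build an IN-clause query from a newline-separated list of ids.
    // The ids become "?" placeholders so they can be bound as parameters
    // rather than concatenated into the SQL text.
    static String buildQuery(String fileContent) {
        List<String> ids = Arrays.stream(fileContent.split("\\R"))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .collect(Collectors.toList());
        String placeholders = ids.stream()
                .map(id -> "?")
                .collect(Collectors.joining(", "));
        // "items" and "id" are hypothetical table/column names
        return "SELECT * FROM items WHERE id IN (" + placeholders + ")";
    }

    public static void main(String[] args) {
        String pasted = "101\n102\n103\n";
        System.out.println(buildQuery(pasted));
    }
}
```

The same idea works whether the list comes from an uploaded file or a pasted text field; only the step that produces the string changes.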

Similar Messages

  • Importing schema + data using a text file containing SQL commands

    I have the 2012 SQL Server Management Studio installed. I received a file (snippet below) from an export of our current DB provider, Advantage. I know I have to edit a lot of commands, but my question is quite simple: how do I use a text file with commands in it,
    like the one below, to create tables and columns and load data into SQL Server?
    ---data.txt file contents----
    -- Table Type of Beneficiary is ADT
    Create Table Beneficiary(
       PlanId Char( 9 ),
       Hash Integer,
       Line1 Char( 128 ),
       Line2 Char( 128 ),
       Batch Char( 16 ) );
    EXECUTE PROCEDURE sp_CreateIndex90( 'Beneficiary', 'LAC_LACV1_Beneficiary.adi', 'KEY1', 'Hash;PlanId', '', 2051, 4096, NULL );
    INSERT INTO "Beneficiary" VALUES( 'LACV1', 983, 'Judith Pursifull~Spouse~100~~~', 'Michael Pursifull~Son~50~Thomas Pursifull~Son~50', '761011042' );
    INSERT INTO "Beneficiary" VALUES( 'LACV1', 996, 'Anna Brownlow~Spouse~100~~~', 'Kaitlin Brownlow~Daughter~85~Lawanda Brownlow~Mother~15', '49036615' );
    INSERT INTO "Beneficiary" VALUES( 'LACV1', 1005, 'Weldon Shelton~Other~50~Star Shelton~Other~50', 'Danica Shelton~Sister~50~Ryan Shelton~brother~50', '109075816' );
    INSERT INTO "Beneficiary" VALUES( 'LACV1', 1031, 'Donald D Duffy~Spouse~100~~~', 'Brandon M Duffy~Son~100~~~', '219979254' );
    INSERT INTO "Beneficiary" VALUES( 'LACV1', 1063, 'Lynne Roffino~Other~50~John Roffino~Other~50', 'Katy Roffino~Sister~50~Margaret Roffino~Sister~50', '604232358' );
    INSERT INTO "Beneficiary" VALUES( 'LACV1', 1062, 'John Teaven Redstone~Spouse~100~~~', '~~~~~', '482024691' );
    INSERT INTO "Beneficiary" VALUES( 'LACV1', 1032, 'Judith Anne Brown (for cat care)~Other~50~Judith Ann (Kappler) Barklage~Other~50', 'LaVerne Cocke (for cat care)~Friend-PantegoBibl~50~~~', '358324107' );
    -- Table Type of Date is ADT
    Create Table Date(
       Hash Integer,
       PlanId Char( 9 ),
       Type Char( 24 ),
       Date Date,
       Override Logical,
       Batch Char( 32 ) );
    EXECUTE PROCEDURE sp_CreateIndex90( 'Date', 'LAC_LACV1_Date.adi', 'KEY1', 'Hash;Date;PlanId;Type', '', 2051, 4096, NULL );
    INSERT INTO "Date" VALUES( 1018, 'LACV1', 'HARDSHIPEND', '2010-02-20', False, '20090820-9414719' );
    INSERT INTO "Date" VALUES( 1018, 'LACV1', 'HARDSHIPSTART', '2009-08-20', False, '20090820-9414719' );
    INSERT INTO "Date" VALUES( 1001, 'LACV1', 'HARDSHIPEND', '2010-02-06', False, '20090806-9371968' );
    INSERT INTO "Date" VALUES( 1001, 'LACV1', 'HARDSHIPSTART', '2009-08-06', False, '20090806-9371968' );
    INSERT INTO "Date" VALUES( 1022, 'LACV1', 'LUMPSUMDISTRIBUTION', '2009-07-21', False, '20090721-9337640' );
    charles.leggette

    from an export of our current DB provider, Advantage.....
    -- Table Type of Date is ADT
    Create Table Date(
       Hash Integer,
       PlanId Char( 9 ),
       Type Char( 24 ),
       Date Date,
       Override Logical,
       Batch Char( 32 ) );
    Hello Charles,
    The SQL syntax, especially for DDL commands, differs between MS SQL Server and Advantage Database, so you have to modify the commands to get them to work.
    As David already wrote, SQL Server doesn't have a stored procedure "sp_CreateIndex90", so you can remove those commands; it also doesn't have a data type "Logical", the nearest equivalent being "bit".
    Olaf Helper

  • Can you use an XML file to make Subclips for you?

    Hi everyone,
    I know this is a bit of a long shot, but I am wondering if I can use an XML file (made after logging a few tapes, but before capture) to make subclips for me?
    Basically I have a painful project with two tapes I had to capture separately as one long individual clip each, due to timecode breaks. As I did the initial Log & Capture as usual (only to later find it wouldn't batch capture), I didn't want my hard work to go to waste!
    Hope I explained that clearly enough,
    Adam

    I'm assuming you captured using the non-controllable device setting and have the clip including TC breaks and blank areas etc.
    Try this:
    Load the clip into the Viewer and play it. Press M to set markers wherever you want them.
    If you want to give the marker a more descriptive name, press M a second time whilst the playhead is positioned on the marker. A dialog will open with text input fields. Or, click the marker in the Browser, then click the marker's name to select it. You can now change the name.
    Drag in the Browser to select all the markers or click the first one, then shift click the last one.
    Choose Modify > Make Subclip. This will make all the subclips in one go and if you gave your markers new names, your subclips will use them.
    The subclips appear in addition to the original clip with the markers. You can rename the subclips, if you want. You can review the subclips, deleting any clips you might not need. If you do remove unused clips, you can use the Media Manager to remove your unused footage from disk, leaving the media for your remaining subclips alone.

  • Exit labview (executables) after using large text files

    Hello,
    I am using LabVIEW 6.0 and its application builder / runtime engine. I wrote some VIs to convert large tab-delimited text files (up to 50 MB). When I am finished with a file it somehow stays in memory and piles up with other (text) files, in such a way that the computer slows down.
    When I want to exit the VI (program) it takes a very long time to get rid of the program (resetting LabVIEW) and get my speed back.
    How can I solve this problem for these large files?
    Martin.

    OK, this may be a bit of a problem to track down, but let's start.
    First, while your front panel looks great, your code is very hard to read. Overlapping elements, multiple nested structures and a liberal use of locals make this a problem. My first suggestion would be to start with a massive cleanup operation. Make more room, make wires straight, make sure things flow left-to-right, make subVIs, add some documentation and so on. You won't believe the difference this makes.
    After you do that, we can turn to finding the problems. Some likely suspects are the local variables and the array functions. You use many local variables and perform resizing operations which are certain to generate copies. If you do this on arrays with dozens of MBs of data, this looks like the most likely source of the problem. Some suggestions to deal with this: if you have repeating code, make subVIs or move the code outside of the structures, so that it only has to run once. Also, you seem to have some redundant code. For instance, you open the file only to see if you get an error. You should be able to do this with the VIs in the advanced palette without opening it (and you won't need to close it, either). Another example: you check the exit conditions in many places in your code. If your loop runs fast enough, there is no need for that. Some more suggestions: use shift registers instead of locals, and avoid setting the same properties over and over again in the loop.
    After you do these, it will probably be much easier to find the problem.
    To learn more about LabVIEW, I suggest you try searching this site and google for LabVIEW tutorials. Here and here are a couple you can start with. You can also contact your local NI office and join one of their courses.
    In addition, I suggest you read the LabVIEW style guide and the LabVIEW user manual (Help>>Search the LabVIEW Bookshelf).
    And one last thing - having the VI run automatically and then use the Quit VI at the end is not very nice. Since you are building it, it will run automatically on its own and you can use the Application>>Kind property to quit only if it's an executable.
    Try to take over the world!

  • Problem in Using a Text file

    Hello,
    I want to read text from a text file, and it's not predetermined text. What I want to do is read the records as text, using "," as the delimiter between each column in my output file.
    I was trying to do this, but somehow I am not achieving anything meaningful so far.
    Waiting for positive replies from the forum members.

    The solution has already been suggested on this thread.
    http://forum.java.sun.com/thread.jspa?threadID=686118
    If you didn't get it, here are the steps again:
    1. Read the file using java io http://java.sun.com/docs/books/tutorial/essential/io/
    2. As you read each line, use StringTokenizer or String.split() to separate each value.
    You do not need to have a predetermined text. The only predetermined thing is your delimiter.
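The two steps above can be sketched like this (a minimal sketch; the input file name "records.txt" and the comma delimiter are assumptions for illustration):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class DelimitedReader {
    // Step 2: split one line on the "," delimiter
    static String[] parseLine(String line) {
        return line.split(",");
    }

    public static void main(String[] args) throws IOException {
        // Step 1: read the file line by line with java.io
        try (BufferedReader in = new BufferedReader(new FileReader("records.txt"))) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] columns = parseLine(line);
                System.out.println(String.join(" | ", columns));
            }
        }
    }
}
```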
    x

  • Help with 2D array input using a text file

    I'm writing a Java program using BlueJ, which requires me to read info from a text file and put it into a 2D array. I got this error message and I couldn't figure out how to fix it. Please help me. Thanks a lot. The program is due tomorrow, please help.
    Error message:
    NoSuchElementException:
    null(in Java.util.StringTokenizer)
    Here's the program with the line where the problem is highlighted
    import java.io.*;
    import java.util.StringTokenizer;
    public class *** {
        // Reads students' test scores from a file and calculates their final grade
        public static void main (String[] args) throws IOException {
            String file1 = "C:/Temp/scores.txt";
            int[][] input = new int[25][5];
            StringTokenizer tokenizer;
            String line;
            FileReader fr = new FileReader (file1);
            BufferedReader inFile = new BufferedReader (fr);
            line = inFile.readLine();
            tokenizer = new StringTokenizer (line);
            for (int row = 0; row < 25; row++)
                for (int col = 0; col < 5; col++)
                    input[row][col] = Integer.parseInt(tokenizer.nextToken()); // --> problem
        }
    }
    This is what the text file looks like:
    1 74 85 98
    2 97 76 92
    3 87 86 77
    4 73 85 93
    5 99 99 83
    6 82 84 95
    7 78 83 91
    8 84 79 84
    9 83 77 90
    10 75 78 87
    11 98 79 92
    12 70 73 95
    13 69 80 88
    14 81 77 93
    15 86 72 80
    16 70 76 89
    17 71 71 96
    18 97 81 89
    19 82 90 96
    20 95 85 95
    21 91 82 88
    22 72 94 94

    Try this code. (I've tested this code on my machine and it works fine.)
    try {
        FileReader fileReader = new FileReader("Z:\\fileInput.txt");
        BufferedReader reader = new BufferedReader(fileReader);
        int[][] fileData = new int[22][4];
        StringTokenizer tokenizer = null;
        String line = reader.readLine();
        int rowCount = 0;
        while (line != null) {
            tokenizer = new StringTokenizer(line, " ");
            int columnCount = 0;
            while (tokenizer.hasMoreTokens()) {
                fileData[rowCount][columnCount] = Integer.valueOf(tokenizer.nextToken()).intValue();
                columnCount++;
            }
            line = reader.readLine();
            rowCount++;
        }
        fileReader.close();
        System.out.println("Done");
    } catch (FileNotFoundException fnfe) {
        System.out.println("File not found " + fnfe);
    } catch (IOException ioe) {
        System.out.println("IO Exception " + ioe);
    }
    The problem with your code is that you should first check whether there are any tokens left before calling nextToken().
    -Amit

  • Reading text file to make a list

    Hello ,
    I want to read a specific column from the text file if the columns are separated by a space.
    When the whole file (as shown here: http://imageshack.com/a/img834/6321/k55hx.jpg ) is read, its column 1 names should be visible in a drop-down list, and whenever a name is selected from the list it should select the respective value from column 2 in the text file and display that value in an indicator.
    I tried as shown here (http://imageshack.com/a/img829/6821/x4h0.jpg ) and got this output (http://imageshack.com/a/img840/1431/ior7.jpg ), but I am unable to get the required result.
    Can someone help me out with this.
    Thanks.

    Hi stefan57,
    If I understand you correctly, you are trying to make a dropdown list with the values from the first column, and when the user changes this value, you want the corresponding data to be shown.
    I have made a small example that demonstrates how this can be done.
    (Please note that this code should be further improved, e.g. by implementing proper error handling.)
    Best Regards
    Alex E. Petersen
    Certified LabVIEW Developer (CLD)
    Application Engineer
    Image House PantoInspect
    Attachments:
    Read Text.vi 25 KB
    File.txt 1 KB

  • How to test an inbound IDoc, without the sender system, using a text file

    Hi gurus,
    We want to test the BLAORD03 inbound IDoc (message type BLAORD) without the sender system,
    on the same client.
    We want to test this IDoc with a text file from our local machine.
    Can anyone give us detailed steps, like how to create the file layout
    with segment names and values for the fields, and how to pass this file to the system.
    Thanks in advance.

    Hi Aparna,
    My requirement is to test the IDoc with an inbound file:
    generate a file with the data entered through segments in WE19, and use the same file for processing through WE16.
    When I try to do this the system complains about
    'Partner profile not available', sometimes
    'Port not available', and sometimes
    'No further processing defined',
    but I maintained the partner profiles and the port correctly.
    Can you help me with testing via a test 'File' port?

  • Search and replace data in excel using a text file

    Hi Scripting Guy:
    I was able to write a script based on all the examples in your blogs, but now I'm stuck:
    I have a text file, which has Server Name, IP Address, Comments
    ABCserver1, 1.1.1.1, remote web server
    DEFserver2, 2.2.2.2, remote app server
    XYZserver3, 3.3.3.3, remote api server
    I have an excel file which has Server Name, IP Address, Protocol, Port. Server Name and IP Address can repeat more than once, at any location in the spreadsheet:
    Server1, x.x.x.x, TCP, 80
    Server2, x.x.x.x, TCP, 80
    Server3, x.x.x.x, TCP, 80
    My script searches the excel spreadsheet for an array that I have defined [Server1,Server2,Server3,...]; once the first value matches in the spreadsheet, it replaces all values of Server1 with the first value from the text file, i.e. ABCserver1, so my new excel sheet looks like
    this:
    ABCserver1, x.x.x.x, TCP, 80
    DEFserver2, x.x.x.x, TCP, 80
    XYZserver3, x.x.x.x, TCP, 80
    What I also want is: when the script replaces the "Server1" value in the spreadsheet, it should also replace the IP address in the next column. This is where I'm stuck. I can replace the Server Name value, but I can't replace the IP address from the text file. Remember,
    "Server1" can be found anywhere in the spreadsheet, but it will always have the IP address column next to it.
    So this is what script will do if runs correctly:
    1) Read text file:
    ABCserver1, 1.1.1.1, remote web server
    DEFserver2, 2.2.2.2, remote app server
    XYZserver3, 3.3.3.3, remote api server
    2) Before script run on excel sheet
    Server1, x.x.x.x, TCP, 80
    Server2, x.x.x.x, TCP, 80
    Server3, x.x.x.x, TCP, 80
    3) After script run on excel sheet
    ABCserver1, 1.1.1.1, TCP, 80
    DEFserver2, 2.2.2.2, TCP, 80
    XYZserver3, 3.3.3.3, TCP, 80
    As you can see, my script is able to replace "Server Name" in the entire spreadsheet, but it can't copy or replace the corresponding IP address in the next column.
    script:
    $text = "Server1","Server2","Server3"
    $replace = Get-Content C:\script\test.txt | ForEach-Object { ($_.Split(","))[0] }
    $File = "C:\script\test.xlsx"
    $now = [datetime]::Now.ToString("yyyy-MM-dd")
    #$now = get-date -Format "MM-dd-yyyy_hh:mm:ss"
    Copy-Item C:\script\test.xlsx test_$now.xlsx
    # Set up Excel, open $File and select the first worksheet
    $i = 0
    $Excel = New-Object -ComObject Excel.Application
    $Excel.Visible = $true
    $Workbook = $Excel.Workbooks.Open($File)
    $Worksheet = $Workbook.Worksheets.Item(1)
    $Range = $Worksheet.UsedRange
    foreach ($SearchString in $text) {
        if ($Range.Find("$SearchString")) {
            $Range.Replace($SearchString, $replace[$i])
        }
        else { $i++ }
    }
    $Workbook.Save()
    $Workbook.Close()
    [void]$Excel.Quit()
    Really appreciate your help
    Thanks

    Hey, thanks for helping me out. I checked online and I had to call Cells this way:
    $Workbook.Worksheets.Item(1).Cells.item($cell.Row,$cell.Column+1)=$test[$i].replace
    One problem I'm having: if the $server value repeats in the excel sheet, the script doesn't replace all occurrences; take server1, for example.
    This is how my excel sheet looks:
    server1
    x
    TCP
    80
    server2
    x
    TCP
    80
    server3
    x
    TCP
    80
    server1
    x
    TCP
    80
    When I run the script, this is how the results look:
    server1
    x
    TCP
    80
    REM88888SQL301A
    10.1.1.2
    TCP
    80
    REM88888SQL301B
    10.1.1.3
    TCP
    80
    REM88888SQL301
    10.1.1.1
    TCP
    80
    It skipped line one in excel and changed the value in line 4. As I mentioned above, $server can repeat anywhere in the spreadsheet, in multiple columns.
    What can we do so it replaces all server names and IP addresses?
    Thanks

  • Using a text file to provide paths for copy-item

    OK, please don't beat me over the head too much for this. I've been trying various methods all day and nothing seems to be working. I have a text file that has a list of folders (full paths, including drive letters). I want to copy all of
    the folders and their contents to another path. The destination path does not change (d:\temp\destination), but each of the folders in the text file should end up being a subfolder of the destination.
    Sources:
    d:\source1 (has file01.txt, file02.txt)
    d:\source2 (has file03.txt, file04.txt)
    Destination
    d:\temp\destination
    Should end up with something like
    d:\temp\destination\source1\file01.txt
    d:\temp\destination\source1\file02.txt
    d:\temp\destination\source2\file03.txt
    d:\temp\destination\source2\file04.txt
    I've tried
    Copy-Item -Path D:\Temp\paths.txt -Destination D:\Temp\attachments -recurse -Force
    which just copies the file that contains my paths.
    I've tried
    foreach($line in (Get-Content -Path D:\Temp\paths.txt)){$dest_folder="D:\Temp\destination" | Copy-Item -path $line -destination $dest_folder -force -recurse}
    I get an error that says
    Copy-Item : The input object cannot be bound to any parameters for the command either because the command does not take
     pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
    Would anyone mind helping me out here? If I need to create a file that has the source folder name as well as the destination path, I can do that; I just figured there has to be a way to specify a base directory and let powershell do its thing from there.

    Try:
    $SourcesFile = ".\sources.txt"
    $Root = "d:\temp\destination"
    foreach ($Source in (Get-Content -Path $SourcesFile)) {
        $Destination = Join-Path -Path $Root -ChildPath (Split-Path -Path $Source -Leaf)
        Copy-Item -Path $Source -Destination $Destination -Recurse -Force
    }
    Sam Boutros, Senior Consultant, Software Logic, KOP, PA (http://superwidgets.wordpress.com)

  • Using part of file name in a query

    I am putting together a member directory and I want to be sure that it is best optimized for the search engines (SEO). For this reason, I want to appear to have a unique file for each member. I know that I could just link to a master file and pass the member ID, like www.example.com/memberdir/member.cfm?memberid=12345, but that isn't the best solution.
    To be better optimized I would like to use the following format:
    www.example.com/memberdir/12345.cfm
    The trouble that I am running into is that the files (12345.cfm, 12346.cfm, etc.) don't exist and return 404s. The only directory on our site that will handle things this way is the 'memberdir' folder. I am trying to handle this without making any modifications to IIS or using ISAPI Rewrite.
    I am using the application.cfm file to handle the processing (the query that pulls the member data). However, I haven't been able to get around the issue of the file not existing. Again, I don't want to have to make changes to IIS, and I don't think it would be efficient to check for the existence of the file and write a shell file (an empty .cfm file for every member) to avoid the 404. I could do that, but it doesn't seem like the ideal solution.
    If anyone has any guidance to point me in the right direction, it would be much appreciated.
    Thanks,
    roblaw

    There was a suggestion made to me on this board previously, which I used. It works really well. Have a page which checks for the existence of the necessary parameters in the URL, and parses accordingly. If the necessary parameters aren't located in the URL, then return a 'personalized' 404 error.
    However, the long and the short of it is that you will have to redirect all 404 errors from the server to hit this special .cfm page (you will have to edit IIS).
    If you want to see this in action, go to www.jxp.com/whatever. Since the call would be a 404 error, it calls the special 404 page that I created, which checks the dB for 'whatever'; if 'whatever' exists it shows the necessary page, otherwise I return a listing of other possible 'hits' you may have been looking for. This will give you an idea of what you can do.
    Hope this helps.

  • Using a *.wmv file to make a home video playable in DVD players?

    Hello all,
    I have a project where I need to take a slide show made in Windows Movie Maker (.wmv file extension), burn it to a CD and make it play in a DVD Player. I attempted just making a VCD on a Windows XP laptop, but it wouldn't play in the DVD Player.
    So, can iDVD '08 help me out with this? I have a Mac Mini, but I'm not very familiar with it yet (bought it just a few months ago).
    For clarification, what I need to do with this wmv slide show is burn it to a CD that can be played in a DVD player and I'm wondering if iDVD can do this for me. Already attempted making a VCD on a Windows XP laptop.
    Any ideas would be great. Thanks!

    Hi
    She can't play it on her PC, nor can I at home.
    Usually it's a DVD player software matter. Windows Media Player is not intended for DVD playing. Did you use it when it failed? Try the free and cross-platform VLC Player.
    Some threads in the DVDSP forum about PC compatibility (use the search feature and you'll find a lot of them):
    http://discussions.apple.com/thread.jspa?messageID=6020916
    http://discussions.apple.com/thread.jspa?messageID=6095226
    http://discussions.apple.com/thread.jspa?messageID=6374894
    EDIT: SOMETHING ELSE
    * DVD Authoring/burning tips thread *
      Alberto

  • Uploading a text file from webi filter area as part of the query condition

    Post Author: balasura
    CA Forum: Publishing
    Requirement: uploading a text file from the WebI filter area as part of the query condition.
    Hi, I have a pressing requirement which I am not sure is available in BO XI. Can someone help me, please? I am using BO XI R2, WebI. I am generating an ad-hoc report, and when I give a filter condition for the report, the condition should be uploaded from a .txt file. In the current scenario we have LOVs, but an LOV can hold only a small number of values; my requirement is just like an LOV, but the list of values will be available in a text file (which could number 2000 or 2500 rows). I would like to upload these 2500 values in the form of a flat text file to make a query and generate a report. Is it possible in BO XI?
    For example:
    Select * from Shipment Where "Shipment id = 'SC4539' or Shipment id = 'SC4598'"
    The "where" condition (filter), which has the shipment ids, will be available in a text file, and it needs to be loaded as a .txt file so that it becomes part of the filter condition. The content of the .txt file could be this:
    shipment.txt
    ===============
    SC4539
    sc2034
    SC2343
    SC3892
    . . . . etc. up to 2500 shipment ids
    I will be very glad if someone could provide me a solution. Thanks in advance. - Bala

    Hi Ron,
    This user does not have access to Tcode ST01.
    The user executed Tcode SU53 immediately following the authorization failure to see the authorization objects. The 'Authorization obj' field is blank, and under the Description it shows 'The last Authorization check was successful' with a green tick mark.
    Any further suggestions, PLEASE.
    Thanks.

  • Using text files to commit in UCM trough ODC

    Hello all.
    Again I'm here trying to do something not that common in ODC. So let me explain what I want, OK?
    EDIT: I forgot my specs:
    We're using a VM in VMware Workstation, with Windows 7 64-bit, an 80 GB HD and 2 GB RAM.
    Using Oracle DB 11gR2 and ODC 10.3.5.
    1 - we already opened a ticket with Oracle Support; the number is SR 3-6108348211. It was closed but we reopened it. So far there is no answer.
    2 - we have many files and want to do a test: which is better for committing files to UCM, ODC or Batch Loader? We know that Batch Loader is the usual one, but maybe there is some speed improvement in check-in using ODC.
    3 - we will get the images from a folder and send them to ODC, then ODC will commit them to UCM, but (here's the catch) sending all the metadata of all those files and making them PDF-searchable too.
    4 - so I'll put some screens here, showing what I did, OK?
    4.1 - In this screen I show all the metadata we need to ship to UCM. In the left column are the metadata fields and in the right, the values.
    http://imgur.com/QkqxA
    4.2 - In a folder, we put all the images and a text file containing all the metadata referring to the images. As the Oracle Support says, the text file should have these lines.:
    C:\Users\oracle\Desktop\import_folder\import1.bmp|4545|Joaozinho|15/15/2015|Import1|11.11
    C:\Users\oracle\Desktop\import_folder\import2.bmp|7878|Augusta|16/16/2006|Import2|22.22
    C:\Users\oracle\Desktop\import_folder\import3.bmp|00001|Gump|07/07/1997|Import3|333.33
    C:\Users\oracle\Desktop\import_folder\import4.bmp|3232|Filipinho|08/08/2008|Import4|99.66
    C:\Users\oracle\Desktop\import_folder\import5.bmp|7968|Maria Antonia|15/12/1985|Import5|88.99
    C:\Users\oracle\Desktop\import_folder\import6.bmp|56563|Coloral|07/08/1999|Import6|22.33
    The first part is the image path, the second its name, the third the odc_Numero field value, the fourth the odc_Nome field value, the fifth the odc_Data field value, the sixth odc_Assunto field value and the last one is odc_Valor field value. All these values need to go to UCM, along with the images.
    4.3 - So I put all the images and text file in the folder.
    http://imgur.com/o8znD
    4.4 - I didn't check OCR zones in the images; all the metadata values will come from the text file, because the images sometimes cannot be trusted for the values (bad scan, old text, etc.).
    4.5 - I open the Import Server and did this config:
    http://imgur.com/t0xh1
    http://imgur.com/PPJGO
    http://imgur.com/RWKfU
    http://imgur.com/v5P5V
    4.6 - I open the commit server and let it in the background, but not doing any batch jobs, just paused.
    4.7 - And I got all these errors:
    http://imgur.com/lCtql
    EDIT: I get this error when opening the import server, just after the login:
    http://imgur.com/YB9qe
    So here are my questions:
    1 - what printer is the ODC needing?
    2 - the files will be already converted to PDF, I just want to convert them to pdf-searchable, using the text file containing all metadata and send them to UCM. Is it possible?
    3 - is there a guide teaching how to do this?
    4 - am I doing it right?
    Sorry for the long post, but this an urgent test.
    Thanks (seriously, a lot) for all the help and guidance.
    Fernando

    Hello Jiri.
    Thank you for your answer.
    Let me answer your questions (I'm using Windows XP this time, but we are working in 7, too):
    - while working with the ODC desktop application, you
    i. define a file cabinet (which also defines index fields, optionally batch statuses, and commit profiles)
    -- I did, these are the fields I've defined for this test
    http://imgur.com/7FyRW
    ii. define a scan profile
    -- Here are the configs for my scan profile
    http://imgur.com/Jyfvh
    http://imgur.com/tveQZ
    iii. define an indexing profile (which also defines index fields the profile will work with, incl. auto-populate)
    -- Here are the configs
    http://imgur.com/sMW7Q
    http://imgur.com/O6Bpl
    http://imgur.com/aqYmU
    http://imgur.com/lGrIn
    http://imgur.com/MeKAz
    http://imgur.com/t5r0i
    http://imgur.com/oco5S
    iv. scan an image (using the scan profile) AND v. index the scanned image (using the indexing profile)
    -- I did that. And works ok. I've used OCR zones sometimes and the scan worked. Sometimes I didn't and it worked OK, too. Commit to UCM in both cases worked. Here is an example:
    http://imgur.com/rp3jI
    http://imgur.com/ZS7FB
    http://imgur.com/cHhjC
    * Now we go to another part:
    ii'. define the import batch job
    -- I've defined in here.
    http://imgur.com/hfEdR
    http://imgur.com/eKsyJ
    http://imgur.com/Kbt5R
    http://imgur.com/1azWe
    http://imgur.com/JqbFI
    http://imgur.com/IkCww
    And after all these configs, when I click "Activate" to make the batch job active, the application closes (in Windows 7 it crashes). When I change to "Import from folder" rather than "Import from list file" I get these messages:
    http://imgur.com/1WAMn
    http://imgur.com/oz5ZL
    Still, the other way is to provide metadata in a List File. The List File is processed directly by Import - and if you define that you want to commit right away Import will try to do it for you, but it can succeed only if it has all the necessary index fields. The error you are getting indicates that something is wrong - it can be that some field is missing, or all the fields are missing (List File could not be parsed).
    -- Yes, that is what I want. I don't need to check barcodes at all. I just need to get all the field values from a list file. Here are the contents:
    C:\Documents and Settings\Administrador\Desktop\import_folderimport1.bmp|4545|Joaozinho|15/15/2015|Import1|11.11
    C:\Documents and Settings\Administrador\Desktop\import_folderimport2.bmp|7878|Augusta|16/16/2006|Import2|22.22
    C:\Documents and Settings\Administrador\Desktop\import_folderimport3.bmp|00001|Gump|07/07/1997|Import3|333.33
    C:\Documents and Settings\Administrador\Desktop\import_folderimport4.bmp|3232|Filipinho|08/08/2008|Import4|99.66
    C:\Documents and Settings\Administrador\Desktop\import_folderimport5.bmp|7968|Maria Antonia|15/12/1985|Import5|88.99
    C:\Documents and Settings\Administrador\Desktop\import_folderimport6.bmp|56563|Coloral|07/08/1999|Import6|22.33
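    A detail worth checking in the list file above: two of the date values (15/15/2015 and 16/16/2006) are not valid dd/mm/yyyy dates, and an unparseable date field is exactly the kind of thing that can make Import report that the list file could not be parsed. A small pre-flight check of each line can be sketched as below; the six-field layout and the dd/mm/yyyy format are assumptions read off the sample lines, and the paths are shortened for illustration:

```python
from datetime import datetime

def check_list_file_line(line, expected_fields=6, date_index=3, date_format="%d/%m/%Y"):
    """Validate one pipe-delimited list file line; return a list of problems found."""
    problems = []
    fields = line.rstrip("\n").split("|")
    if len(fields) != expected_fields:
        problems.append(f"expected {expected_fields} fields, got {len(fields)}")
        return problems  # field positions are unreliable, stop here
    try:
        # Reject values such as 15/15/2015 that a date-typed index field cannot hold
        datetime.strptime(fields[date_index], date_format)
    except ValueError:
        problems.append(f"bad date value: {fields[date_index]!r}")
    return problems

# Illustrative lines modeled on the list file above (paths shortened)
lines = [
    r"C:\img\import1.bmp|4545|Joaozinho|15/15/2015|Import1|11.11",
    r"C:\img\import6.bmp|56563|Coloral|07/08/1999|Import6|22.33",
]
for line in lines:
    for p in check_list_file_line(line):
        print(f"{line.split('|')[0]}: {p}")
```

    Running a check like this over the real list file before handing it to Import would at least separate "bad metadata" from "bad batch job configuration" as the cause of the failure.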
    Thanks again.
    If you could tell me what I am doing wrong, it would help me a lot.
    Fernando

  • Using Get-ChildItem to scan different drives but append to one text file.

    Folks,
    I need some help. I wrote the script below to scan for certain file types on client machines, but I can't get it to scan anything other than C$. Apologies, I am kind of new to scripting and PS. The script is kicked off using a batch file that creates the
    CR directory on the clients. I think the problem stems from the $disk variable or the way I have used foreach in the loop - any help would be appreciated.
    Set-Location c:\
    $s = New-Object -ComObject SAPI.SPVoice
    $s.Speak("Scan in progress")
    Write-Host "Scanning for files"
    $Extension = @("*.zip","*.rar","*.ppt","*.pdf")
    $Path = "$env:userprofile\Desktop"
    $disks = "C:\","D:\"
    # Use a distinct loop variable - "foreach ($disks in $disks)" overwrites the collection.
    # Get-ChildItem also needs -Path, or only the current location (C:\) is searched,
    # and >> appends so the second drive's results do not overwrite the first.
    foreach ($disk in $disks)
    {Get-ChildItem -Path $disk -Force -Include $Extension -Recurse -EA SilentlyContinue >> $Path\files.txt}
    $s.Speak("Scan completed")
    Write-Host "Scan complete, files.txt created on your desktop - press any key to exit..."
    $x = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyUp")
    Remove-Item c:\cr -Recurse -EA SilentlyContinue
    exit

    Then there is "$x = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyUp")" - what are you trying to do there?
    It looks like a pause after the code has completed, to give the user time to read the message before exiting.
    @V Sharma - As gaff-jakobs has mentioned, there are some parts of your code that need to be looked at. Please take a look at what he has advised and make the necessary changes. If you want to append data (a plain > overwrites on each pass), I would look at
    using Out-File -Append -FilePath <filename>, or the >> redirection operator, to accomplish this.
    Usually, outputting to a text file limits what you can do with the data later. I would look at using something such as Export-Csv -Append -Path instead (assuming you are running PowerShell V3). But if saving to a text file is part
    of your requirement, then just be sure to use the first method I mentioned.
    Boe Prox
    Blog |
    Twitter
    PoshWSUS |
    PoshPAIG | PoshChat |
    PoshEventUI
    PowerShell Deep Dives Book
