File disambiguation

Situation as follows:
Customer uses the same filename (say 'purchaseOrder') to represent two quite distinct text files. One text file ('purchaseOrder') we'll call a 'type 1' purchase order, and the other ('purchaseOrder') a 'type 2' purchase order. The two files use the same / similar fields, but these are arranged and ordered differently in each file type.
In summary:
'purchaseOrder' being the one common filename, can either be:
--> purchaseOrder (type 1); with structure 'A'
--> purchaseOrder (type 2); with structure 'B'
Each type really should be identified using separate filenames, but unfortunately they share the same file name.
When using the File/FTP 'native format builder', separate schemas are created: one for the 'type 1' purchase order (structure 'A') and a different one for the 'type 2' purchase order (structure 'B'), given each purchase order type has a different delimitation / structure. This is fine, but it doesn't help on its own: the FTP adapter cannot determine which XML schema to apply when performing the 'get' operation on the one common filename 'purchaseOrder'.
I could ask the customer to change the filename, i.e. generate each purchase order with a separate filename depending on its type, e.g. 'purchaseOrder-type1' for structure 'A' and 'purchaseOrder-type2' for structure 'B'. This would let me assign a separate 'native format builder' XML schema to each file type for the FTP adapter. The problem is that this would be a substantial application change for the customer and is therefore not a viable option at this point.
If I'm stuck with 'purchaseOrder', which can be either file type, how can I disambiguate the purchase order type so the correct XML schema can be used to map the text file (either a 'type 1' or a 'type 2' file)?
I've thought of a way, but it's a hack and only a partial solution, given it may not deal with unexpected errors very well, e.g.:
- FTP 'get' purchaseOrder using the 'opaque' option (base-64 encoded)
- Use BPEL's embedded Java capability to open the file and determine (based on a pre-determined set of ordered attributes) which type it really is, i.e. use Java in BPEL to disambiguate (a sketch of this step follows below)
- Write the file to a local file system (via outbound File adapter) using a relevant filename which captures the semantic meaning of the file e.g. 'type 1' or 'type 2' purchase order
- Read the file (with the 'correct' filename) from the local file system (via inbound File adapter). We then know which purchase order type we are dealing with
- Continue processing as required...
I am looking for a better solution / option / variation of the above, assuming one is available...
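For illustration, here is a minimal sketch of the embedded-Java disambiguation step in the hack above. It assumes the record tags shown in the sample data further down the thread ('PURCH' vs 'PICKS'); the class and method names are placeholders, not part of any adapter API:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Minimal sketch only: decode the opaque (base-64) payload fetched by the FTP 'get'
// and inspect the leading record tag to decide which purchase order type it is.
// The 'PURCH'/'PICKS' tags and all names here are illustrative assumptions.
public class PurchaseOrderSniffer {

    public static String detectType(String base64Payload) {
        byte[] raw = Base64.getMimeDecoder().decode(base64Payload);
        String text = new String(raw, StandardCharsets.UTF_8);

        // Classify on the first record's leading field.
        if (text.startsWith("PURCH")) {
            return "type1";   // structure 'A'
        }
        if (text.startsWith("PICKS")) {
            return "type2";   // structure 'B'
        }
        throw new IllegalArgumentException("Unknown purchase order layout");
    }
}

The returned value could then drive the filename used when the file is written back out via the outbound File adapter in the next step.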

user650847 wrote:
Hi Johan,
Sample sanitised data as follows:
PURCH 112834502469011 1O130/10/0707:53:42
PURCH 112831000061733 1O208/11/0711:15:22
PICKS 112830010263742 0001O3V307
The first 2 file samples have the same structure and different content. The third one has a different structure. Create a schema that has two complex types, one for each type of data. Then create a root element with a choice group between the two types, like this:
<xsd:element name="purchaseOrder">
  <xsd:complexType>
    <xsd:choice maxOccurs="unbounded" nxsd:choiceCondition="fixedLength" nxsd:length="5">
      <xsd:element ref="purchaseOrderType1" nxsd:conditionValue="PURCH"/>
      <xsd:element ref="purchaseOrderType2" nxsd:conditionValue="PICKS"/>
    </xsd:choice>
  </xsd:complexType>
</xsd:element>
See [http://download-uk.oracle.com/docs/cd/B31017_01/integrate.1013/b28994/nfb.htm#CCGDIIFC] for details and other similar samples.
Regards,
Johan Rydström

Similar Messages

  • Cannot find file .dat

    After running the perl script, three files are produced:
    1. file .ctl
    2. file .log
    3. file .dis
    When using SQL*Loader to load the file, the following error message is reported:
    "file .dat cannot be found". I can see that the data to be loaded into the table is in the .dis file. I usually edit the .ctl file first and add the line
    infile 'directory where the .dis file is located'

    Sounds like you have the old version of Creator. Please go to this website (you need to join the Sun Developer Network first) and download the new one which will not require you to have any license:
    http://developers.sun.com/prodtech/javatools/jscreator/downloads/index.jsp
    Thanks.

  • Need to put contents of a file stored in FileContents into a ByteBuffer

    As the title says. Can't find anything that shows how to accomplish the goal. Have a file stored in FileContents. Need to get the file's contents into a ByteBuffer. Any ideas how I can go about doing that?
    Thanks

    That's all the spoon feeding I needed. Minus the DataInputStream, I was dead on with my final "guess" before the previous reply. DataInputStream was what i needed to complete the puzzle. Here's the final code (simple enough):
    int length = new Long(fileContents.getLength()).intValue();
    byte bytes[] = new byte[length];
    InputStream file = fileContents.getInputStream();
    DataInputStream dis = new DataInputStream(file);
    dis.readFully(bytes);
    ByteBuffer buf = ByteBuffer.wrap(bytes);
    Thanks for the spoon feeding.
    Edited by: wizzbangca on Feb 1, 2008 4:07 PM

  • I have PSD CC, had CS6 ex; I did the update and now can't open ANY files. Reinstalled, same... dead programs

    I work in PSD CS6 ex constantly; recently I updated to CC. Today I can't open any file - recent, old, a new one, open from library, etc. NOTHING will open; when I request a file, the programs act like they are deaf. They aren't locked up, they just don't open or create files. Everything on my computer is updated. I uninstalled and reinstalled, same result. Also, I can no longer install CS6, just CC.

    Even the term "I request a file" is ambiguous.
    Please provide more detailed information.  Ideally, show some screen grabs showing what you are seeing, and identify just what isn't working.
    Best advice, based on no specific information provided, is to uninstall using the Adobe Creative Cloud Cleaner Tool / process to really get everything cleaned up, then reinstall.
    http://www.adobe.com/support/contact/cscleanertool.html
    -Noel

  • Parameters in Report Viewer?

    Post Author: SBurbacher
    CA Forum: General
    Kind of new to Crystal Reports, so this may be a dumb question... I'm using Crystal Reports XI Release 2 and Reports Viewer XI.  I've created a report with parameters that determine record selection.  When I open the report in Report Viewer, it doesn't allow me to select new parameter values like I had hoped.  Is this normal operation, or am I missing something?  If this is normal, what options do I have to allow my end users to take full advantage of selecting parameter values without having to set them up with a full version of Crystal Reports? Thanks!

    Post Author: Ron
    CA Forum: General
    I have installed Windows 2003 Standard Edition SP1, Crystal Reports Server XI R2 + SP2 + BOXIR2 FixPack 2.6. My CR Server doesn't allow me, under the Administrator account, to edit parameter values via the (2nd way) of my post above - it just raises "Internal Error. Internal Error Occurred" and I can't do anything, so I don't know what to do instead of the (1st way) described above. Even the (1st way) only works some of the time, so sometimes I receive an error in CR Viewer: "CrystalReportViewer List of Values failure: fail to get values. [Cause of error: Error in File dis > region > school: Database Connector Error] Unable to retrieve Object. List of Values failure: fail to get values. [Cause of error: Error in File dis > region > school: Database Connector Error]" where dis, region and school are my dynamic parameters.
    My database connection is set up properly - I checked it by running the same report with pre-set values (all this time the Server doesn't prompt me for new parameter values; it uses the old values from CR Designer). While publishing reports I've tried every combination: (checked/unchecked) Save Report Data, (checked/unchecked) Use Repository to refresh Data, (checked/unchecked) Allow user to refresh reports, and many other cases.
    The result is the same - it doesn't work properly. So I don't know how to work later on if every report requires such difficult actions from creating to publishing and using :(

  • StreamTokenizer parsing dates?

    Hi,
    I have a csv file which I am trying to parse; it looks like this:
    2002-03-01,2002-03-02,2002-03-03
    301111,10,20,30
    301112,30,40,50
    etc...
    the first line represents the dates, and the subsequent lines each represent
    a student, followed by the marks gained on the dates (I know they are a
    column out, but 301111 got 10 on 2002-03-01, 20 on 2002-03-02 and so on).
    I have a parser written using StreamTokenizer which works fine parsing the
    students and marks when I don't have the top line with the dates, but now I
    am trying to add the date functionality with no luck.
    It seems to be parsing the file and seeing the initial 2002's as number
    tokens and ignoring the -03-01 etc. How can I tell it to see every single
    character except the ',' as regular characters? I presume it's doing this
    because it is thinking that '-' is a delimiter of some sort.
    Also, once I have told it to ignore everything except a ',' as delimiters,
    will it then see '2002-03-01' as a TT_NUMBER or a TT_WORD? I really could do
    with it being seen as a TT_WORD if possible.
    Here is the code I'm using:
    try {
        FileReader fr = new FileReader(filename);
        StreamTokenizer in = new StreamTokenizer(fr);
        in.whitespaceChars(',', ',');
        boolean EOF = false;
        boolean gotStudent = false;
        //some variables to put the row data in
        ArrayList rowData = new ArrayList();
        String thisStudent = null;
        while (!EOF) {
            if (in.nextToken() == in.TT_NUMBER) {
                double d = in.nval;
                int i = (int) d;
                String s = Integer.toString(i);
                if (s.length() <= 3) { //found mark
                    if (gotStudent == false) {
                        System.out.println("ERROR: File in incorrect format");
                        return;
                    } else {
                        rowData.add(s);
                    }
                } else { //found student ID
                    if (gotStudent == false) {
                        thisStudent = s;
                        gotStudent = true;
                    } else {
                        //found a new student, so store old data and start new one
                        studentBlock.add(new StudentRow(thisStudent, rowData));
                        rowData = new ArrayList();
                        thisStudent = s;
                    }
                }
            } else if (in.nextToken() == in.TT_WORD) {
                //Trying to see the dates, just want to print them out for the time being
                String s = in.sval;
                System.out.println(s);
            } else if (in.nextToken() == in.TT_EOF) {
                //Store last dataItem
                studentBlock.add(new StudentRow(thisStudent, rowData));
                EOF = true;
            }
        }
    } catch (FileNotFoundException e) {
        System.err.println("ERROR: File not found - " + e);
    } catch (IOException e) {
        System.err.println("ERROR: Error reading input file - " + e);
    }
    Richard

    Your input file contains ambiguous data (all tokens, including dates, appear to be numeric to the StreamTokenizer). I suggest that you read the data one line at a time and parse each line with a StringTokenizer as shown below (note that since dates only appear in the first line, I parse that line outside of the student loop):
    import java.io.*;
    import java.util.*;
    public class testStreamP {
       public static void main(String[] args) {
          String in;
          double studentID;
          double[] marks;
          int i;
          try {
             BufferedReader fr = new BufferedReader(new FileReader(args[0]));
             in = fr.readLine();
             StringTokenizer c = new StringTokenizer(in, ",");
             int cols = c.countTokens();
             String[] dates = new String[cols];
             marks = new double[cols];
             for (i = 0; i < cols; i++) {
                dates[i] = c.nextToken();  // fetch the dates from the first line
                System.out.println(dates[i]);
             }
             while ((in = fr.readLine()) != null) {
                c = new StringTokenizer(in, ",");
                studentID = Double.valueOf(c.nextToken()).doubleValue();
                System.out.println(studentID);
                for (i = 0; i < marks.length; i++) {
                   marks[i] = Double.valueOf(c.nextToken()).doubleValue();
                   System.out.println(marks[i]);
                }
             }
          }
          catch (FileNotFoundException e) {
             System.err.println("ERROR: File not found - " + e);
          }
          catch (IOException e) {
             System.err.println("ERROR: Error reading input file - " + e);
          }
       }
    }
    V.V.
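    For the delimiter question raised in this thread, a minimal sketch along these lines (the sample data is assumed inline; class and method names are placeholders) shows how resetting StreamTokenizer's syntax table makes every character except ',' and line breaks part of a word, so dates such as 2002-03-01 come back as TT_WORD:
    import java.io.IOException;
    import java.io.StreamTokenizer;
    import java.io.StringReader;
    // Minimal sketch: only ',' and line breaks separate tokens; everything else,
    // including "2002-03-01", is returned as a TT_WORD string in st.sval.
    public class DateTokenDemo {
       public static void main(String[] args) throws IOException {
          String sample = "2002-03-01,2002-03-02,2002-03-03\n301111,10,20,30\n";
          StreamTokenizer st = new StreamTokenizer(new StringReader(sample));
          st.resetSyntax();                // drop the default number parsing
          st.wordChars(0, 255);            // every character is part of a word...
          st.whitespaceChars(',', ',');    // ...except ',' which separates tokens
          st.whitespaceChars('\n', '\n');  // and line breaks
          st.whitespaceChars('\r', '\r');
          while (st.nextToken() != StreamTokenizer.TT_EOF) {
             System.out.println(st.sval);  // every token, dates included, is a TT_WORD
          }
       }
    }
    Numbers then also arrive as strings (e.g. "301111"), which is what the question asks for; convert them with Integer.parseInt where a numeric value is needed.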

  • My Mac Pro crashed, cannot locate hard drive.  Is there any way of locating my photos?

    My Mac Pro crashed a few days ago.  I did have Mavericks installed.  No backup - shame on me.  Cannot locate my hard drive.  Cannot boot into Safe Mode.  Apple installed another hard drive and I have the old one.  Any recommendations?  They said for another $99 they can try to locate the data, but it's not guaranteed.  Any other suggestions?  Thanks!

    Drivesavers or any other company should be the safest way to recover data, but professional help comes at professional prices.
    If you want to do it on your own, do some reading first; there are many important things to consider…
    Remove the HD until you are ready to recover.
    You must avoid writing to the bad disk at all costs - it can overwrite data you are hoping to recover. Ideally a write blocker can be used or tools like https://github.com/aburgh/Disk-Arbitrator (the binary download is a compiled app) can be used to disable automatic read-write mounting of disks.
    You will need a spare HD to recover onto, it should be larger than the original disk (many tools find duplicates of data).
    If the HD's hardware is failing or has failed, using the disk can do much more damage to the data. Clicks, buzzing, scratching, humming or any other strange noises may indicate a hardware failure - the pros would take it apart in a clean room & rebuild it, possibly using parts from a spare HD of the same type. You cannot do this, so look for pro help if the symptoms fit. You may only get one chance to read the disk if it is severely failing.
    Data or file corruption may be recoverable if the HD is in working order, tools that scan for file headers should find files, but you may well lose the metadata (filename, labels, spotlight comments, modification dates…). You may need to sift through a lot of files with ambiguous names.
    It's possible just the disk catalog is damaged - tools like Disk Warrior may be able to find the files & rebuild the catalog structure. This will maintain the filenames & most of the metadata - it may even make the disk bootable again.
    Disk Drill will allow you to look at the state of the HD for free, and then purchase to recover (if the HD is working).
    http://www.cleverfiles.com/
    There are many other apps, check similar threads here for other options.
    There are open source tools that will work via the command line, so you need to be comfortable in Terminal…
    http://www.cgsecurity.org/ has photorec & test disk - both are excellent IMO, check the FAQ & Wiki on that site for more info.  Photorec is a 'filecarver' & testdisk looks for partitions to recover/ rebuild.
    ddrescue and dd_rescue (they are different & I can't remember which is best) - these attempt a 'full block copy' to another disk. This can allow you to try recovering files from a new clone of the bad disk - saving any wear & tear on the failing hardware. You can also try multiple repair tools, since you would be working on a 'block for block copy' of the damaged disk.
    Forensics sites may have info if you specifically look at the data recovery & 'file carving' info.
    http://www.forensicswiki.org/wiki/Main_Page
    http://www.forensicswiki.org/wiki/Carving
    There is also the option of Linux live CD's that have a lot of file recovery tools pre-installed, but you would need to look for ones with HFS+ extended support.
    There is a lot to consider, good luck

  • How To "Dis-Embed" An Image From A .Doc File?

    Here's what I've got. I'm working on a flier to be commercially printed; it contains four photos and text. I've spent a lot of time with each photo, massaging each in PE to maximize its quality when it's eventually printed, and saved each as a TIFF file. (I "tested" each photo's printability by seeing what I get when I run it through my own Canon printer.)
    What I did next was to take each of the PE processed photos, moved them into a MS Word text document, and saved the final flier results as a .doc file.
    Now, my commercial print shop tells me that better results might be obtained by me giving them the original photos instead of using the completed .doc flier. This is where my problem arises: it's not that I saved too few, I saved too many versions of the photos as works in progress. It's not an easy job to find the exact photo version I, finally, embedded. Using the "direct" approach doesn't seem to work. When I drag a photo out of its .doc file onto my desktop and try to open this clipping (PICT) with PE I get the message, "Could not complete your request because an unexpected end-of-file was encountered."
    So, how do I "dis-embed" each of these photos from the original document and without losing all the nice stuff I did when first processing them? Any comments and suggestions will be appreciated.
    Dave

    Hi,
    You can use javax.swing.ImageIcon like so
    ImageIcon anIcon = new ImageIcon("image.jpg");
    Hope that's what you are after,
    Cheers
    Jack573

  • Internal application error  in scheduling a  dis file.

    Hi friends, I am finding a problem in scheduling a .dis file. Discoverer says "internal application error". Some other files are getting scheduled fine. The query for this particular report is long. Any suggestions? It's urgent... The Discoverer version is 4.1.

    Hi,
    Not really enough information to debug via a forum; you may need a Service Request with Oracle Support. If staying on 4.1 (I'd recommend 10.1.2.0.2 instead), I suggest you patch to 4.1.48.08.
    Additionally, ensure the worksheet runs to completion without scheduling... since you mentioned it is long you may need to increase the data cache, etc.
    Regards,
    Steve

  • "Read From Binary File" function Help ambiguity

    I must be getting tired, but for some reason a doubt crept in my mind as I was designing a new piece of code this morning:
    "is the "Read From Binary File" using the last file position or is it starting from the beginning of the file?"
    "That's a stupid question", I told myself.
    "I used this function a million times and have always assumed it is reusing the last file position. Moreover, there is no file offset input to that function, so WTH am I afraid of?"
    So, for kicks, I fired up the Help window and read the following description (*):
    Reads binary data from a file and returns it in data. How the data is read depends on the format of the specified file. This function does not work for files inside an LLB.
    (*) BTW, has anybody ever complained that you can't select and copy anything from the floating Help Window?
    Not much there. I particularly admire the phrasing of the second sentence... What about: "This function can do a lot of things, but it would be much too complex to describe this in extensive detail, so if you are asking, you probably can't afford to use it"?
    Anyhow, I clicked on the "Detailed Help" and got this (among other things):
    Use the Set File Position function if you need to perform random access.
    WHAT? I am pretty darn sure I DO NOT USE the Set File Position when I read a file in successive and contiguous chunks. I just pass the file refnum into a shift register and back to the function and that's it.
    Now, the description of the "Refnum Out" output says: If file is a refnum or if you wire refnum out to another function, LabVIEW assumes that the file is still in use until you close it. Translated into plain English, is that supposed to mean that if the file is not closed it is open, or is it implying that it contains more info than just "the file is open and can be found here"?
    I started searching around and finally ended up with the entry for "refnums, file I/O". Down the bottom of the (long) article, I found this under the heading "References to Objects or Applications" (but nothing specific to files, BTW):
    ...LabVIEW creates a refnum associated with that file, device, or network connection...
    [...]  LabVIEW remembers information associated with each refnum, such as the current location for reading from or writing to the object and the degree of user access, so you can perform concurrent but independent operations on a single object. If a VI opens an object multiple times, each open operation returns a different refnum. LabVIEW automatically closes refnums for you when a VI finishes running, but it is a good programming practice to close refnums as soon as you are finished with them to most efficiently use memory and other resources.
    So it seems that my recollection was correct. I do not know what the "degree of user access" for a file is, but that's not the topic of today's post. 
    So, my point is: the Help File for this function is incomplete or ambiguous at best. Please correct it. And provide a link to the "refnum, file I/O" Help entry in its detailed Help. It would H E L P...
    Thanks for reading.

    Reading in successive chunks is *NOT* random access. An open file always has
    a current position, which is updated with each read or write operation.
    You only need to set the file position if you want to start elsewhere.
    LabVIEW Champion. Do more with less code and in less time.

  • .Dis Files getting errored out while migrating

    Hi
    During migration of a .dis file from one instance to another instance,
    while opening the .dis file I am facing this error:
    Cannot join tables used in the workbook. Item dependency "" not found in the EUL.
    Please help me.
    Regards
    Nakul Venkataraman

    The first thing to check is that the EULs on both instances are the same. I would suspect that there is a missing join between folders in the target instance.

  • Discoverer command line + Deploying eex and Dis Files?

    Hi,
    I have one .eex file and one .dis file. Now I want to deploy the two files from a development server to an integration server.
    Is there any command line interface to deploy the two files?
    thanks a lot
    Wolle

    Hi,
    I've been exporting and uploading Disco objects, including workbooks, via the Disco admin command line / .eex files (not .dis) at least since 4i (if not 3i, but I can't remember that far back now!). Windows cmd file example for Disco 10g:
    Download:
    set PATCH=XXXX_YYYY
    set DISCO=C:\oracle\BIToolsHome_1\BIN\dis51adm.exe
    set USER=EULOWNER
    set RESP=XXXX Resp
    set DB=DEV
    set PASS=password
    %DISCO% /CONNECT "%USER%:%RESP%/%PASS%@%DB%" /APPS_USER /EUL EUL_US /IDENTIFIER /EXPORT XXXX_YYYY.eex XXXX_YYYY_BA /WORKBOOK XXXX_YYYY_WB /WORKBOOK XXXX_YYYY_2_WB /FUNCTION XXXX_YYYY_FN /LOG %PATCH%_DL.log /SHOW_PROGRESS
    Upload (I don't hardcode db/password, use a bit more trickery, but you get the idea):
    set EEXFILE=XXXX_YYYY
    set PATCH=XXXX_YYYY
    set DISCO=C:\oracle\BIToolsHome_1\BIN\dis51adm.exe
    set USER=EULOWNER
    set RESP=XXXX Resp
    set EEXNAME=XXXX My BA
    set db=TEST
    set pw=password
    REM Sometimes better to delete the BA first, but you need to be sure no one is changing the BA in the target env
    REM %DISCO% /CONNECT "%USER%:%RESP%/%pw%@%db%" /APPS_USER /EUL EUL_US /DELETE_BUS_AREA "%EEXNAME%" /LOG %PATCH%_UL.log /SHOW_PROGRESS
    %DISCO% /CONNECT "%USER%:%RESP%/%pw%@%db%" /APPS_USER /EUL EUL_US /IMPORT %EEXFILE%.eex /REFRESH /LOG %PATCH%_UL.log /SHOW_PROGRESS /IDENTIFIER
    Regards,
    Gareth
    http://garethroberts.blogspot.com

  • Archive .DIS files

    Looking for a way to migrate a database of .DIS files to a new location so we can delete most of them as we are in the process of getting a new platform. Our IT has advised they can create an EEX file -- how can we then recreate a folder of .DIS files should we need workbooks on an individual basis for end users?

    Arf,
    I have a few questions that you may be able to assist me with...
    My backups and archive logs go to ASM +FRA
    On the script I have:
    SPOOL LOG TO $location/rman.log
    1. I can't find this log file; I am not sure if it is in the FRA or how to find it.
    CROSSCHECK ARCHIVELOG ALL;
    DELETE NOPROMPT EXPIRED ARCHIVELOG ALL;
    DELETE NOPROMPT ARCHIVELOG ALL COMPLETED BEFORE 'SYSDATE-7';
    DELETE NOPROMPT OBSOLETE;
    SQL 'ALTER SYSTEM CHECKPOINT';
    SQL 'ALTER SYSTEM SWITCH LOGFILE';
    BACKUP DEVICE TYPE DISK TAG '%TAG' ARCHIVELOG ALL NOT BACKED UP;
    BACKUP VALIDATE DATABASE ARCHIVELOG ALL;
    RELEASE CHANNEL;
    SPOOL LOG OFF
    EXIT;
    2. You recommended DELETE ALL INPUT after the archive logs have been backed up. Also the recommendation was to back up the archive logs NOT BACKED UP 1 TIMES; that means we will have only one copy for recovery. Am I wrong? My company does not have a tape system; the backups and archive logs are saved on disk only. I thought that 10 days would be sufficient, but I am getting hundreds of archive logs because I have CONTROLFILE AUTOBACKUP ON.
    3. The retention Policy will be in effect for backups and archivelogs.
    I really appreciate your advice... Thanks, Terry

  • Direct inward System Access (DISA) Audio or script files ??

    Dear All,
    I was looking for the Simple-DISA-SBCS.zip file; I searched the Cisco website and support community forums but I am unable to find it.
    It would be very helpful if anyone could send the link to download the files, or send the files themselves.
    Model : UC560


  • HT3986 I've had MS Office:mac 2011 on my imac for around 18 months now.  Outlook has just disappeared and when I find the file and open it it tells me that there is a problem and I may need to re-install it.  I've just done this using the installation disc

    I've had MS Office:mac 2011 on my iMac for around 18 months now.  Outlook has just disappeared and, when I find the file and open it, it tells me that there is a problem and I may need to re-install it.  I've just done this using the installation disc, which then said the installation had been successful.
    Outlook is still not working.  Can anyone please advise me on what to do next.

    Remove MS Office 2011 completely (here are instructions) and reinstall it.
    It's not a simple or fast process but it is important to follow all of the steps in order to get all the files that Office scatters around. This will not affect your data files, only MS Office and its preferences.
