Lightweight library for reading and writing mp3-tags

Hi,
I'm currently evaluating libraries for reading and writing mp3 tags. All the projects I have found so far have not been maintained for a couple of years.
My question:
Is there currently a library that one could call the standard for this purpose?
Cheers
Jonny

jonnybecker wrote:
I'm currently evaluating libraries for reading and writing mp3 tags.
Don't you rather mean ID3 tags?
All the projects I found so far have not been maintained for a couple of years.
Such a project may either be dead, or it may be so finished and free of bugs that no maintenance is needed anymore. If you're running into problems with one of them and it is open source, you can always consider forking it for your own development.
Is there currently a library that one could call the standard for this purpose?
None comes to mind.
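If all you need are the legacy ID3v1 fields, you don't even need a library: the tag is simply the last 128 bytes of the file. Below is a minimal plain-java.io sketch (the file name song.mp3 is made up, and it ignores ID3v2, which is what most modern files actually carry), so treat it as a starting point rather than a full solution:

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public class Id3v1Reader {
    // Reads the legacy ID3v1 tag stored in the last 128 bytes of an MP3 file.
    public static void main(String[] args) throws IOException {
        try (RandomAccessFile f = new RandomAccessFile("song.mp3", "r")) {
            if (f.length() < 128) {
                System.out.println("File too small to contain an ID3v1 tag");
                return;
            }
            byte[] tag = new byte[128];
            f.seek(f.length() - 128);
            f.readFully(tag);
            if (!"TAG".equals(new String(tag, 0, 3, StandardCharsets.ISO_8859_1))) {
                System.out.println("No ID3v1 tag found");
                return;
            }
            System.out.println("Title : " + field(tag, 3, 30));
            System.out.println("Artist: " + field(tag, 33, 30));
            System.out.println("Album : " + field(tag, 63, 30));
            System.out.println("Year  : " + field(tag, 93, 4));
        }
    }

    // ID3v1 fields are fixed width, padded with NUL bytes or spaces.
    private static String field(byte[] tag, int offset, int length) {
        return new String(tag, offset, length, StandardCharsets.ISO_8859_1).trim();
    }
}

For full ID3v2 support a dedicated library is still the sensible route, but a sketch like this is often enough to check whether your files are tagged at all.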

Similar Messages

  • I updated iTunes today to the latest version. Windows 7 64-bit. None of my drivers work and I get an error when iTunes starts about registry settings for reading and writing DVDs and CDs missing

    I updated iTunes today to the latest version. Windows 7 64-bit. None of my drivers work and I get an error when iTunes starts about registry settings for reading and writing DVDs and CDs missing. Anyone else have the same issue? I downloaded iTunes again and reinstalled, but I still have the same issue.

    I'd start with the following document, with one modification. At step 12 after typing GEARAspiWDM press the Enter/Return key once prior to clicking OK. (Pressing Return adds a carriage return in the field and is important.)
    iTunes for Windows: "Registry settings" warning when opening iTunes

  • Quickest method for reading and writing files

    Hi
    I need help regarding file operations (reading and writing). Currently I am using BufferedReader and BufferedWriter to read and write files, but the XML files are very large (30-50 MB) and this is slowing the application down considerably. Is there any other approach to perform these operations on XML files faster?
    Thank You
    Mansoor.

    Hi
    Can you let me know how to use the java.nio package for primitive data types (int, float, ..., boolean)? I have tried it but had no success.
    Thank You
    Mansoor
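    A minimal java.nio sketch for primitive values, in case it helps; the file name and record layout (one int, one double, one boolean stored as a byte) are made up for the example:

    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    public class NioPrimitives {
        public static void main(String[] args) throws IOException {
            Path path = Paths.get("values.bin");   // hypothetical file

            // Write an int, a double and a boolean (as one byte) through a single buffer.
            ByteBuffer out = ByteBuffer.allocate(13);   // 4 + 8 + 1 bytes
            out.putInt(42).putDouble(3.14).put((byte) 1);
            out.flip();
            try (FileChannel ch = FileChannel.open(path,
                    StandardOpenOption.CREATE, StandardOpenOption.WRITE,
                    StandardOpenOption.TRUNCATE_EXISTING)) {
                ch.write(out);
            }

            // Read them back in the same order.
            ByteBuffer in = ByteBuffer.allocate(13);
            try (FileChannel ch = FileChannel.open(path, StandardOpenOption.READ)) {
                ch.read(in);
            }
            in.flip();
            System.out.println(in.getInt() + " " + in.getDouble() + " " + (in.get() != 0));
        }
    }

    For the original 30-50 MB XML question, bigger (or memory-mapped) buffers follow the same pattern, but with XML the parser (SAX/StAX streaming rather than a DOM) usually matters more than the raw I/O speed.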

  • The VI of the I2C driver for reading and writing registers was functional in 6.1, but that same VI is not working in versions 7.0 and 7.1. WinIO is also used in generating the DLL, as the OS is Windows 2000. What could be the problem?

    This is the CPP for read; a similar VI was made for write.
    Attachments:
    I2C_Read.cpp ‏7 KB
    I2C_Read.h ‏2 KB
    I2C_Exe.h ‏1 KB

    Hi,
    I'm not familiar with the WinIO function that you are using here, but there are a couple of things to watch out for when doing low-level access.
    The most common problem with low-level I/O functions is that if they were created for Win9x, they will not work under Windows NT/2000/XP. NT-based OSes require kernel access for any low-level communication. Make sure that the WinIO library you are using is supported under Win2K.
    Since you have the code for the DLL available, you could attach LabVIEW as an external process in your C compiler and debug the DLL; this can help you track down the exact function that is failing.
    Just my 2 cents.
    Regards,
    Juan Carlos
    N.I.

  • What is the maximum size of the files we can use for reading and writing

    Hi
    I have attached a VI which I use to read and write reports.
    My system runs continuously for more than 2 days, and I typically expect a file size of 30 MB per day. I keep appending data to the file.
    The VI opens the file, appends to it, and closes it every time.
    Does this affect performance as the file size increases?
    What is the effective way to do it?
    Is there any file size restriction in Windows?
    Attachments:
    Read_or_Write_Current_Cycle's_Data.zip ‏27 KB

    First: The performance impact of a growing file depends on the file system (FAT32 or NTFS) you are using. NTFS will be better as the file size increases. The size of the partition will also have an effect on this.
    Second: The effective way is not to close and reopen the file. Use "File I/O > Advanced File Functions > Flush File" to write the file to the system instead of File Close and File Open. Flush File forces the file system driver to write the file to disk and update the file information on disk. On NT-based systems this is done with a latency of 4 seconds; that is true even if you close the file, and the time cannot be changed.
    Third: There are file size restrictions which do not depend on Windows; they depend on the file system. FAT32 can handle files up to 4 GB (2^32 bytes) and NTFS up to 2^64 bytes. Since LV is platform independent, the LV file functions can handle only 4 GB: the offset parameter in the file functions has an I32 data type.
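    The same open-once-and-flush idea, sketched in Java purely as an illustration (file name and loop are made up; in LabVIEW the equivalent is the Flush File function mentioned above):

    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    public class AppendAndFlush {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Open the report file once and keep it open for the whole run.
            try (BufferedWriter report = Files.newBufferedWriter(
                    Paths.get("report.txt"), StandardCharsets.UTF_8,
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
                for (int cycle = 0; cycle < 10; cycle++) {   // stands in for the multi-day acquisition loop
                    report.write("cycle " + cycle + ": data...");
                    report.newLine();
                    report.flush();   // push to the OS without the cost of close/reopen
                    Thread.sleep(100);
                }
            }
        }
    }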
    Waldemar
    Using 7.1.1, 8.5.1, 8.6.1, 2009 on XP and RT
    Don't forget to give Kudos to good answers and/or questions

  • MacBook Pro for reading and writing NTFS for my external hard drive

    Hello,
    I have the new MacBook Pro with Retina display and I tried transferring some of the files on my MacBook to my external hard drive (WD My Passport). Of course, it didn't let me. I did some research and found out that Mac only "reads" NTFS volumes but doesn't write to them. I did some more research to see if there was a way around this. There are several options and I need help deciding which one is best.
    1.  Formatting my external hard drive to the FAT32 file system - however, it says that there are limitations with writing file sizes larger than 4GB. Is that true?
    2.  Formatting my external hard drive to the exFAT file system - is this a better option?
    3.  Downloading MacFuse - is it really discontinued?
    4.  Downloading OSXFUSE - is this better than MacFuse?
    5.  Downloading fuse4x - I don't know about this one
    Aside from all these individual questions, which option is the best? And which option will give me the least damage or data loss, and which is the most reliable? All I want is to be able to transfer my documents or files from my Mac to my hard drive, but also be able to read/write files on any other PC. I bring my hard drive everywhere and I always need to access my files on whatever computer is around (like in a computer lab at school or the library).
    If you can answer all of my questions, YOU'RE AWESOME!
    Thank you!

    tin2x58 wrote:
    1.  Formatting my external hard drive to the FAT32 file system - however, it says that there are limitations with writing file sizes larger than 4GB. Is that true?
    Yes, exFAT is better.
    2.  Formatting my external hard drive to the exFAT file system - is this a better option?
    Yes, you won't need any third party software, one less hassle.
    3.  Downloading MacFuse - is it really discontinued?
    4.  Downloading OSXFUSE - is this better than MacFuse?
    5.  Downloading fuse4x - I don't know about this one
    Don't bother.
    Aside from all these individual questions, which option is the best? And which option will give me the least damage or data loss, and which is the most reliable? All I want is to be able to transfer my documents or files from my Mac to my hard drive, but also be able to read/write files on any other PC. I bring my hard drive everywhere and I always need to access my files on whatever computer is around (like in a computer lab at school or the library).
    Take a copy of all the data off the drive. Formatting the drive will erase all existing data.
    Take the drive to the oldest Windows OS you're going to use it with. If that is Windows XP, there is a free exFAT update from Microsoft; install it first, then right-click the drive and format it as exFAT. Windows will create the appropriate partition table (either GUID or MBR) automatically.
    Take the drive to the Mac; it will read it just fine whatever it is.
    Drives, partitions, formatting w/Mac's + PC's

  • Reading and Writing large Excel file using JExcel API

    hi,
    I am using JExcelAPI for reading and writing Excel files. My problem is that when I read a file with 10000 records and 95 columns (file size about 14 MB), I get an out of memory error and the application crashes. Can anyone tell me whether there is a way to read a large file with JExcelAPI through streams, or in any other way? Jakarta POI is also showing this behaviour.
    Thanks in advance

    Sorry, when the out of memory error occurs no stack trace is printed because the application crashes. But I will quote some lines from JProfiler where the problem occurs:
              reader = new FileInputStream(new File(filePath));
              workbook = Workbook.getWorkbook(reader);
              sheet = workbook.getSheet(0); // here the out of memory error occurs
    JProfiler tree:
    jxl.Workbook.getWorkBook
          jxl.read.biff.File
                 jxl.read.biff.CompoundFile.getStream
                       jxl.read.biff.CompoundFile.getBigBlockStream
    Thanks

  • Reading and writing a ByteArray

    This question was posted in response to the following article: http://help.adobe.com/en_US/as3/dev/WS5b3ccc516d4fbf351e63e3d118666ade46-7d54.html

    I believe that the example of Reading and Writing Objects needs a rewrite.
    First, the attempt to use comments to interleave a Flex-ified example with a Flash example of the uses of the ByteArray class is a shortcut that ultimately fails. Take the time to create two separately valid examples for the two different frameworks.
    Second, the interleaved comments show a rather poor example of how to write clean code, and are entirely confusing to the person you are trying to help learn how to use your library. Look at the declaration of a variable called outFile:File, then a comment about some function with the same name, outFile(), that is supposedly declared somewhere else; indeed, if this code were to compile -- which I am sure it could not -- the compiler would see that you try to invoke an outFile() function. No one gets through an interview writing code that way.
    Third, I am not sure you understand how the readBytes and writeBytes methods work. In the upper example, where an order is being written to the file system, the writeBytes method uses data.length correctly. At the point in time it is used, the XML object has already been written into the bytes:ByteArray, so it has a proper length.
    On the other hand, in the second half of the example, where the readBytes method similarly passes data.length, the reader has to be very lucky to notice that at that point data.length is zero: the inBytes:ByteArray has only been instantiated, not populated with anything, as indeed it should not be at that point in the logic flow. Most importantly, when the program does need to readBytes, there is no way for it to know in advance how long the byte stream coming in from the file system will be. Your example would be more correct, and would make a useful point, by showing that readBytes(data) is the right way to pass the arguments: the default behavior is then used, which makes data whatever length it needs to be in order to hold the content from the file system. The program can know the value of data.length after the read has taken place, but not before.
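    For comparison, here is the same principle in Java (only an analogy, since the documentation above is about the ActionScript ByteArray): read until end of stream instead of passing a length you cannot know in advance. The file name is made up.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class ReadAll {
        // Copies an entire stream without knowing its length beforehand.
        static byte[] readFully(InputStream in) throws IOException {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            byte[] chunk = new byte[8192];
            int n;
            while ((n = in.read(chunk)) != -1) {   // -1 signals end of stream
                buffer.write(chunk, 0, n);
            }
            return buffer.toByteArray();
        }

        public static void main(String[] args) throws IOException {
            try (InputStream in = Files.newInputStream(Paths.get("order.xml"))) {   // hypothetical file
                byte[] data = readFully(in);
                System.out.println("Read " + data.length + " bytes");   // length is known only after reading
            }
        }
    }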

  • Problem reading and writing from a *.txt file

    I have a problem reading and writing from a *.txt file. The following is the read() method...
    The compiler says that DataInputStream.readLine() is deprecated. Can anyone help me please?
    public void read()
    {
        File file = new File("C://Documents and Settings//Charles//My Documents//Brunel//EE2065//Assignment and Lab//Assignment 4 and Lab 4//data.txt");
        FileInputStream in = null;
        String str = "";
        try
        {
          in = new BufferedReader(file);
          //in = new FileInputStream(file);
          for(;;)
          {
            str = new BufferedReader(in).readLine();
            //str = new DataInputStream(in).readLine();
            if(str == null)
              break;
            System.out.print(str);
          }
          in.close();
        }
        catch(IOException e)
        {
            System.err.println("execution error: " +e);
        }
      }

    Thank you for your reply. I have made some changes. However, there is now an "incompatible types" error on this line:
    in = new BufferedReader(new InputStreamReader(in));
    The following is all of the code.
    public void read()
    {
        File file = new File("C://Documents and Settings//Charles//My Documents//Brunel//EE2065//Assignment and Lab//Assignment 4 and Lab 4//data.txt");
        FileInputStream in = null;
        //BufferedReader in = null;
        String str = "";
        try
        {
          in = new BufferedReader(new InputStreamReader(in));
          //in = new FileInputStream(file);
          for(;;)
          {
            BufferedReader Bstr = new BufferedReader(new InputStreamReader(in));
            //str = new BufferedReader(in).readLine();
            //str = new DataInputStream(in).readLine();
            if(str == null)
              break;
            System.out.print(str);
          }
          in.close();
        }
        catch(IOException e)
        {
            System.err.println("execution error: " +e);
        }
    }
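    The "incompatible types" error comes from assigning a BufferedReader to a variable declared as FileInputStream (and wrapping in around itself instead of opening the file). For reference, here is a corrected sketch of what read() presumably intends, using a single BufferedReader over a FileReader; the long data.txt path from the post is shortened here:

    import java.io.BufferedReader;
    import java.io.File;
    import java.io.FileReader;
    import java.io.IOException;

    public class TextFileReader {
        public void read() {
            // Substitute the full data.txt path from the post above.
            File file = new File("data.txt");
            try (BufferedReader in = new BufferedReader(new FileReader(file))) {
                String str;
                while ((str = in.readLine()) != null) {   // readLine returns null at end of file
                    System.out.println(str);
                }
            } catch (IOException e) {
                System.err.println("execution error: " + e);
            }
        }

        public static void main(String[] args) {
            new TextFileReader().read();
        }
    }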

  • Switcher - Reading and Writing External IEEE HDDs

    Greetings,
    I'm trying to read and write with my external IEEE drive on Windows and the Mac Pro. Its format is NTFS, and it is read-only on the Mac Pro.
    I am admin and can't seem to change the permissions to read/write.
    The overall idea is that I need to go back and forth between operating systems, reading and writing from both sides of the house.
    here's a snapshot.
    http://briankross.com/images/500GBIEEEHDD.jpg
    Mac Pro   Mac OS X (10.4.8)  

    Unfortunately, no. In order to convert the drive to FAT32 it must be re-partitioned which is a destructive process.

  • Very slow painting while reading and writing doubles to a file

    For a 15 MB file, i = 7662080.
    For a 50 MB file, i = 12414368.
    Part of the code for writing to the file follows:
    try {
        fos = new FileOutputStream("Angel.txt");
        File f = new File("Angel.txt");
        if (f.length() >= 4)
            f.delete();
        fos = new FileOutputStream("Angel.txt");
        dos = new DataOutputStream(new BufferedOutputStream(fos, 1000000));
        int x = 0;
        double y_last, y_new;
        for (int j = 0; j < i; j++) {
            if (some condition) {
                y_new = ....;
                try {
                    // previously in vectors
                    y_last = y_new;
                    vect.add(new Line2D.Double(x, y_last, x, y_new));
                    dos.writeDouble(y_new);
                } catch (Exception e) { System.out.println(e); }
            }
            x++;
        }
        dos.close();
        fos.close();
    } catch (Exception excp) { System.out.println(excp); }
    Part of the code for reading from the file follows:
    public void paint(Graphics g)
    {
        try {
            double y1, y2 = 0;
            Line2D.Double doub;
            raf = new RandomAccessFile("Angel.txt", "r");
            dis = new DataInputStream(new BufferedInputStream(new FileInputStream(raf.getFD()), 1000000));
            raf.seek((rect.x * 8));
            for (int i = 0 /* any value */; (i < value as per choice); i++) {
                g2.setStroke(new BasicStroke(0)); //2
                y1 = y2;
                y2 = dis.readDouble();
                doub = new Line2D.Double(i, y1, i, y2);
                g2.draw(doub);
            }
            dis.close();
            raf.close();
        } catch (Exception excp) { System.out.println(excp); }
    }
    I tried using Object Streams, but a NotSerializableException is thrown because Line2D.Double objects are not serializable.
    Any idea for making reading from and writing to the file faster, especially for multi-megabyte files, is appreciated.

    Why are you reading the file in the paint method?
    Create your data once, before painting.
    I think you should explain what your goal is and what behavior you want.
    Denis
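    A sketch along the lines Denis suggests (class and method names are made up): read the doubles once, cache the line segments, and keep painting limited to drawing. For files with millions of samples you would additionally keep only the visible window in memory, but the structure is the point here.

    import java.awt.Graphics;
    import java.awt.Graphics2D;
    import java.awt.geom.Line2D;
    import java.io.BufferedInputStream;
    import java.io.DataInputStream;
    import java.io.EOFException;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import javax.swing.JPanel;

    public class WavePanel extends JPanel {
        private final List<Line2D.Double> segments = new ArrayList<>();

        // Read the file once, outside of painting.
        public void loadData(String fileName) throws IOException {
            segments.clear();
            try (DataInputStream dis = new DataInputStream(
                    new BufferedInputStream(new FileInputStream(fileName), 1000000))) {
                double yLast = 0;
                int x = 0;
                while (true) {
                    double yNew;
                    try {
                        yNew = dis.readDouble();
                    } catch (EOFException end) {   // no more doubles in the file
                        break;
                    }
                    segments.add(new Line2D.Double(x, yLast, x + 1, yNew));
                    yLast = yNew;
                    x++;
                }
            }
        }

        @Override
        protected void paintComponent(Graphics g) {
            super.paintComponent(g);
            Graphics2D g2 = (Graphics2D) g;
            for (Line2D.Double seg : segments) {   // drawing only; no file I/O here
                g2.draw(seg);
            }
        }
    }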

  • File importer detected an inconsistency in the file structure of (file name). Reading and writing this file's metadata (XMP) has been disabled.

    I have duplicated a project to work on another computer. The project opens fine, but when I import new footage/audio files I get this message: "File importer detected an inconsistency in the file structure of (file name). Reading and writing this file's metadata (XMP) has been disabled." Then I can't play my timeline and Premiere Pro crashes. I have to force quit and restart my computer to continue working. What does this error really mean, and how do you fix it?
    Our workflow requires that we duplicate projects to make updates, because we are frequently revising but need to keep the original project unchanged and intact.

    I have a similar issue and message, but it occurs when I import AVI clips from OnLocation CS4 into Premiere Pro CS4.

  • Reading and writing to a BLOB column is very slow

    Hi
    I want to write a serialized Java object called 'engine' to a database table called javaObjectsDB. The serialized object could be anywhere between 20 MB and 4 GB in size. I am using the following code to write this 'engine' object:
    // Write java object to BLOB in DB - Start
    Connection conn = null;
    conn = Util.getConnectionFromDS();
    PreparedStatement ps = null;
    String sql = null;
    Engine.startWatch();
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    Engine.endWatch("Time for ByteArrayOutputStream");
    Engine.startWatch();
    ObjectOutputStream oos = new ObjectOutputStream(baos);
    Engine.endWatch("Time for ObjectOutputStream");
    Engine.startWatch();
    oos.writeObject(this.engine_);
    Engine.endWatch("Time for oos.writeObject");
    oos.flush();
    Engine.startWatch();
    byte[] data = baos.toByteArray(); // in case of using ByteArrayOutputStream
    Engine.endWatch("Time for toByteArray");
    Engine.startWatch();
    sql = "insert into javaObjectsDB values (?)";
    ps = conn.prepareStatement(sql);
    ps.setObject(1, data); // in case of using ByteArrayOutputStream
    ps.executeUpdate();
    Engine.endWatch("Time for prepare statement and insert");
    // Write java object to BLOB in DB - End
    To read from the database BLOB column back into the engine object I use the following code:
    try {
        String readBlob = "SELECT javaObject FROM javaObjectsDB";
        Statement readBlobStmt = conn.createStatement();
        ResultSet resultSet = readBlobStmt.executeQuery(readBlob);
        while (resultSet.next()) {
            // ByteArrayStream - Start
            Engine.startWatch();
            Blob blob = resultSet.getBlob(1);
            Engine.endWatch("loadPlan::Time for getBlob");
            Engine.startWatch();
            byte[] data = blob.getBytes(1, (int) blob.length());
            Engine.endWatch("loadPlan::Time for getBytes");
            Engine.startWatch();
            ByteArrayInputStream bais = new ByteArrayInputStream(data);
            Engine.endWatch("loadPlan::Time for ByteArrayInputStream");
            Engine.startWatch();
            ObjectInputStream oin = new ObjectInputStream(bais);
            Engine.endWatch("loadPlan::Time for ObjectInputStream");
            Engine.startWatch();
            engine_ = (Engine) oin.readObject();
            Engine.endWatch("loadPlan::Time for oin.readObject");
            System.out.println("engine object prepared");
            // ByteArrayStream - End
        } // while of result set
    The times taken to write and read an engine object of size 124 MB are:
    write - 41 secs
    read - 28 secs
    For an engine object of size 340 MB:
    write - 3 minutes
    read - 1 minute
    This is a lot of time for such a small object size, and since we are expecting sizes up to 4 GB, I need better performance. Please suggest what I can do to improve read and write performance.
    One thing I tried was using cached LOBs, but it was not of much use.

    Welcome to the forum!
    Unfortunately this is an Oracle forum and your question should be posted in the Java JDBC forum.
    https://forums.oracle.com/forums/category.jspa?categoryID=288
    Please create a question on the JDBC forum, post a link to that new question here, in this forum, and mark this question answered.
    That way anyone seeing this question can followup with you in the other forum.
    Thanks.
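    Before re-posting, one thing worth trying in the code itself: stream the serialized object instead of materializing it as a single byte[]. The sketch below assumes a JDBC 4.0 driver (for the long-length setBinaryStream overload) and reuses the javaObjectsDB table from the post above; whether it actually helps depends on the driver and its LOB settings, so treat it as an experiment, not a definitive fix.

    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class BlobStreaming {

        // Serialize to a temp file, then stream that file into the BLOB column,
        // so the object never has to sit in memory as one giant byte[].
        static void writeEngine(Connection conn, Serializable engine) throws Exception {
            File tmp = File.createTempFile("engine", ".ser");
            try (ObjectOutputStream oos = new ObjectOutputStream(
                    new BufferedOutputStream(new FileOutputStream(tmp)))) {
                oos.writeObject(engine);
            }
            try (FileInputStream in = new FileInputStream(tmp);
                 PreparedStatement ps = conn.prepareStatement(
                         "insert into javaObjectsDB values (?)")) {
                ps.setBinaryStream(1, in, tmp.length());   // stream instead of ps.setObject(1, byte[])
                ps.executeUpdate();
            } finally {
                tmp.delete();
            }
        }

        // Read the BLOB back as a stream and deserialize directly from it,
        // skipping the intermediate blob.getBytes() / ByteArrayInputStream copy.
        static Object readEngine(Connection conn) throws Exception {
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT javaObject FROM javaObjectsDB")) {
                if (rs.next()) {
                    try (ObjectInputStream oin = new ObjectInputStream(
                            new BufferedInputStream(rs.getBinaryStream(1)))) {
                        return oin.readObject();
                    }
                }
                return null;
            }
        }
    }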

  • Creating an external content type to read and update data from two tables in SQL Server using SharePoint Designer

    Hi
    How do I create an external content type to read and update data from two tables in SQL Server using SharePoint Designer 2010?
    I created a BCS service using the Central Administration site.
    I have two tables in SQL Server:
    1)Employee
    -empno
    -firstname
    -lastname
    2)EmpDepartment
    -empno
    -deptno
    -location
    I just want to create a list that displays employee details from the two tables
    empid firstname deptno location
    and at the same time updates both tables.
    adil

    When I try to create an external content type based on a view (AdventureWorks2012.vSalesPerson), I can display the data in an external list. When I attempt to edit it, I get an error:
    External List fails when attached to a SQL view        
    Sorry, something went wrong
    Failed to update a list item for this external list based on the Entity (External Content Type) 'SalesForce' in EntityNamespace 'http://xxxxxxxx'. Details: The query against the database caused an error.
    I can edit the view in SQL Manager, so it seems strange that it fails.
    Any advice would be greatly GREATLY appreciated. 
    Thanks,
    Randy

  • iPhone 4: how can I read and write files on the phone through USB?

    iPhone 4: how can I read and write files on the phone through USB?

    No idea what you are asking.
    Please explain
