Last.fm - manually generate & upload statistics?

Hi!
There are quite a few tools in the repos that seem to offer support for uploading statistical data to a last.fm account while playing music from the filesystem, but...
- Does anyone know how that's transmitted? Like what the "file" looks like, or what protocols/methods are available?
- How to manually upload "artist - title - played"?
- How to have whole directories of mp3's uploaded as "played" (or text files / lists)?
- Where to start figuring all that out?
(To put that into context: I'm trying to improve my recommendations channel here.)
Thanks!

Thanks, so far so good...
write header:
echo -e "#AUDIOSCROBBLER/1.1\n#TZ/UTC\n#CLIENT/<lastmsg0.1>" > .scrobbler.log
add a line for every mp3 file in the directory / subdirectories:
while IFS= read -r file; do mp3info -p "%a\t%l\t%t\t%n\t%S\tL\t$TIMESTAMP\t$ID\n" "$file" >> .scrobbler.log; done < <(find ./ -name '*.mp3')
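Putting the two commands together, a minimal sketch of a complete script could look like this (assuming bash and mp3info are available; the client string, $TIMESTAMP and $ID are placeholders you still have to fill in, and %S is mp3info's total track length in seconds):
#!/bin/bash
# Sketch only: build a .scrobbler.log for every mp3 below the current directory.
LOG=.scrobbler.log
TIMESTAMP=$(date +%s)   # placeholder: one fixed timestamp for all tracks
ID=""                   # placeholder: MusicBrainz Track ID, may stay empty
# header (echo needs -e so the \n sequences become real line breaks)
echo -e "#AUDIOSCROBBLER/1.1\n#TZ/UTC\n#CLIENT/<lastmsg0.1>" > "$LOG"
# one tab-separated line per track: artist, album, title, track number,
# length in seconds, rating (L = listened), timestamp, MusicBrainz ID
while IFS= read -r file; do
    mp3info -p "%a\t%l\t%t\t%n\t%S\tL\t$TIMESTAMP\t$ID\n" "$file" >> "$LOG"
done < <(find ./ -name '*.mp3')
# note: "< <( ... )" process substitution is a bash feature, so run this with
# bash (e.g. "bash genera.sh" or a #!/bin/bash shebang), not plain sh.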
... now:
1) Strange: the while/read line works from the command line, but not inside a bash script. I'm totally new to writing/using scripts, so maybe it's obvious to someone else why it says: " genera.sh: line 4: syntax error near unexpected token `\<' "?
2) I'm wondering if small differences in play time (seconds) have a big impact on recommendation quality (maybe half of the files end up as an entry of their own that nobody else shares as played/loved because of +/- 3-5 seconds)? Does last.fm correct for that, or am I supposed to?
3) Also, the timestamp... is it used for anything, or is it just eye candy? I'm thinking of generating the timestamps backwards from the actual time, with a random delay of 22-33 minutes between each "play" (see the snippet after this list)... something?
4) Less of a problem I guess, but nonetheless: is there an easy AND basic way to upload these things? Like with some post/http/ftp/whatever tool? I haven't looked into that yet, but "qtscrob" sounds like "GUI for KDE with 100 dependencies included", and "Perl script to submit these" sounds like "a quick look at that will make me accidentally learn the basics of Perl for the next 10 hours"...
5) Unimportant: this "MusicBrainz Track ID" actually IS unimportant, isn't it? Or is there an easy way to get it (like a program that can optionally write it into the ID3 tag of all mp3s, or a way to generate it from info that's already in there)?
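As a rough illustration of the idea in 3), stepping the timestamp backwards by a random 22-33 minutes per track could look something like this (sketch only; the interval and the echo line are placeholders):
#!/bin/bash
# Sketch: the newest "play" gets the current time, every earlier one is pushed
# back by a random 22-33 minutes (RANDOM % 12 gives 0-11, so 22 + 0..11 minutes).
TS=$(date +%s)
while IFS= read -r file; do
    echo "$file -> timestamp $TS"   # replace with the mp3info call above, using $TS
    TS=$(( TS - ( (RANDOM % 12 + 22) * 60 ) ))
done < <(find ./ -name '*.mp3')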
Last edited by whoops (2009-06-04 16:28:15)

Similar Messages

  • Criteria for generating optimizer statistics

    Dear members,
    Oracle 10.2.0.2
    os : HP-UX 11i
    I want to know whether there are any criteria for generating optimizer statistics
    for the production database, like the criteria we have for rebuilding indexes
    by checking the index_stat table.
    Are there any tables in the database for this kind of pre-check before
    generating the optimizer statistics?
    Thanks.

    VIRENDER SINGH wrote:
    Dear members,
    Oracle 10.2.0.2
    os : HP-UX 11i
    I want to know whether there are any criteria for generating optimizer statistics
    for the production database, like the criteria we have for rebuilding indexes
    by checking the index_stat table.
    Are there any tables in the database for this kind of pre-check before
    generating the optimizer statistics?
    First of all I hope you've heard that rebuilding indexes is something that should only be done under very rare circumstances, see e.g. here or here.
    Since you're on 10g already, you should be aware that the database comes with a pre-configured default statistics collection job that runs every night and attempts to gather statistics on stale objects (either objects with no statistics at all or with more than 10% changes since the last statistics gathering).
    If you have an application that modifies large chunks of data in batch jobs, then you should consider gathering statistics right after these batch modifications, because otherwise it might take too long until the default job runs and refreshes the statistics, and in the meantime many execution plans might suffer from the incorrect/outdated statistics.
    Ideally you should know your application and data and work out when to gather the statistics, and in particular how to gather them, and in some cases even use hand-crafted statistics (using the DBMS_STATS.SET_*_STATS procedures) that lead the optimizer in the desired direction.
    There are many potential pitfalls to consider. For example, the default method for generating histograms has changed in 10g to "FOR ALL COLUMNS SIZE AUTO", which means you'll get histograms for columns that are used in WHERE clauses (with equality or range comparisons) and have skewed data or contain large gaps in the values. This was not the case in 9i, and the existence of histograms does not always change execution plans for the better, in particular if you're using bind variables.
    In addition, 10g introduced handling of "out-of-range" predicates, which means that the optimizer takes into account whether a predicate value lies outside the recorded minimum and maximum value of the column. If the gap becomes significant the selectivity is adjusted, and eventually you get an estimate of 1 row (or actually 0 adjusted to 1), which can have dramatic effects on your execution plans.
    This means you need to check whether the default "staleness" threshold of 10% changes is applicable to your particular situation, otherwise you might be confronted with execution plans that either deteriorate over time or "switch" at a certain point in time to a really "bad" plan due to the adjusted selectivity caused by the "out-of-range" predicates. If this is the case, you should consider refreshing the statistics more often (using your own logic/job) to avoid such situations.
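    As an illustration of the "gather statistics right after the batch modifications" point, a minimal shell sketch could look like this (connect string, schema and table name are placeholders; the method_opt override is only shown because of the histogram discussion above and is not necessarily what you want):
    #!/bin/bash
    # Sketch: refresh optimizer statistics on one table right after a batch load,
    # instead of waiting for the nightly default job (all names are placeholders).
    echo "EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname=>'SCOTT', tabname=>'BATCH_TABLE', cascade=>TRUE, method_opt=>'FOR ALL COLUMNS SIZE 1')" \
      | sqlplus -s scott/tiger@orcl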
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

  • How to get custom file name in manually generated SSRS report

    Hi,
    I am creating an SSRS report which will be executed manually by the user through the Report Server URL. The user will generate the SSRS report for different Customer IDs on an ad-hoc basis.
    I am passing CustomerID as an input parameter to the report. Is there any way to get the manually generated SSRS report named 'Report_CustomerID_TodayDate.xls'? Please let me know.
    E.g. if the user generates the report for Customer ID 123 today, then the report name should be 'Report_123_07092013.xls'.
    Thanks for your help in advance.
    Thanks,
    Abhiyanta

    Hi Amar,
    If possible, can you please provide the custom code to rename the exported file in SSRS?
    Thanks in advance,
    Regards,
    Jagan

  • Script for generating VPN statistics

    Hi.
    I am sending accounting info to a RADIUS server that stores it in a mySQL database.
    Is there a script or an open source solution available to generate some statistics about the use of the VPN concentrator? Users per day/month/year, top users, traffic generated, and so on.
    Thanks.
    Ramada

    great work, dude.
    Thanks for sharing.

  • Manually generate SAP Note queue in SNOTE

    Hi!
    I would like to know if it is possible to manually generate a queue of SAP notes in transaction SNOTE. Instead of importing one note after the other, I would like to generate a queue.
    If this is possible, do you also know if SNOTE will determine the correct sequence of notes, and also if it will determine if there are any prerequisite notes for the notes I am importing?
    Thanks.
    Regards,
    Thomas

    Hi Olivier.
    I am very familiar with support packages and SPAM, but thanks for the information anyway.
    However, I don't see why being able to manually create short queues of notes is such a bad idea. After all, SNOTE itself automatically creates queues from time to time. Please see this link for information: [SAP Note Queue |http://help.sap.com/saphelp_nw70/helpdata/en/c3/0db13a95bed348e10000000a114084/frameset.htm]
    -Thomas

  • Why is iCloud asking for apple id password? I have started using Last Pass to generate passwords that are more secure. Ideas please?

    Why is iCloud asking for apple id password? I have started using Last Pass to generate passwords that are more secure. Ideas please?
    GCG

    The iCloud password isn't saved anywhere on your device, this is the way it's supposed to work.

  • How to manually generate events in SMC 3.5 ?

    Dear,
    We are using the SDK of SMC 3.5 to build an application. To test the application, how can we generate events manually? I'd like to have a script on Solaris that, when run, generates an event in the SMC database. Does anyone have such a script?
    Best regards,
    Marc.

    Convert the CR2 files to DNG files, a folder at a time, using the latest DNG Converter that is from Adobe:  http://www.adobe.com/downloads/updates/
    OR buy an upgrade to LR 4.x.

  • Only last 30 photos streaming uploaded to my PC

    Hi all,
    I've downloaded iCloud for PC hoping to have all the photos since my first iphone (like 2 years ago).
    Instead, I find that iCloud on my PC has only uploaded the last 30 photos. Why is that? What happens to the rest?
    Thank you for your help!

    You cannot access Photo Stream from iCloud.com.
    Photo Stream is a sync tool, not a backup solution.
    You can get the last 30 days of photos from your Photo Stream by installing and signing into the iCloud Control Panel.
    To get your Whatsapp and Camera Roll data you will need to restore it from your backup to another iOS device.

  • ERP System ID is not appearing for Manual Replication upload in Properties

    The customer is uploading data for manual replication but is unable to select the ERP as the source system ID, since it does not appear in the drop-down menu.

    Hi,
    The prerequisite for selecting a source system is that a system, including a system instance, is defined. If it is an SAP Business Suite system, it must be defined as such. If not, you need to select a non-SAP system in the properties of the uploaded file.
    Best regards, Reinhard

  • Multiple SELECTS or one Procedure to generate Count Statistics?

    I've got an application showing purchases from a catalog. I
    need to show statistics on those purchases -- the number of
    buyers by product by month, the number of buyers by source
    (space ads, direct mail, telephone, online) by month, the number
    of buyers by state by month, etc. In other words, there are a
    lot of row counts to generate.
    The question is, am I better off creating an SQL statement for
    each count request, or, since I have so many counts to generate,
    am I better off writing an SQL Procedure to loop through all the
    rows with a single table scan and update tables with the counts?
    An example count request -- counts by product code by month:
    SELECT TO_CHAR(purchdate, 'YYYYMM'), prodcode, COUNT(*)
    FROM transactions
    GROUP BY TO_CHAR(purchdate, 'YYYYMM'), prodcode;
    Oracle is pretty good at optimizing SQL, and it's certainly
    easier to focus on each SQL statement one at a time. But, it
    will require many full-table scans (and/or many indexes). The
    Procedure route is much more difficult to write in a general way
    (since I really have 30 or 40 different tables to do these
    statistics on with different columns to count in each table,
    with more coming in all the time).
    One more bit of info -- these counts are only generated after
    the file is updated. These updates are done monthly or
    quarterly, so these counts are not run all the time and are not
    generated on demand. We save them as PDF files for viewing.
    Any advice is greatly appreciated! :-)
    PS. I currently do this using Oracle Reports. This ends up as
    the "one SQL statement per count" method. But, I've just gotten
    a file in with 4 million records, and I don't want to wait until
    next year for the report to finish! :-)
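    Just to illustrate the "one scan, several counts" side of the question, a single GROUPING SETS query can return more than one of these breakdowns in one pass over the table (sketch only: "source" and "state" are assumed column names, the connect string is a placeholder, and GROUPING SETS needs a reasonably recent Oracle version):
    #!/bin/bash
    # Sketch: three count breakdowns from one pass over TRANSACTIONS via GROUPING SETS.
    SQL="SELECT TO_CHAR(purchdate,'YYYYMM') AS yyyymm, prodcode, source, state, COUNT(*)
           FROM transactions
          GROUP BY GROUPING SETS (
            (TO_CHAR(purchdate,'YYYYMM'), prodcode),
            (TO_CHAR(purchdate,'YYYYMM'), source),
            (TO_CHAR(purchdate,'YYYYMM'), state));"
    echo "$SQL" | sqlplus -s scott/tiger@orcl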

    Dear Mr. Nagrajan,
    Thank you for your response. I have already gone through the documents but am not able to understand. Is there any setup for this, or is it just a workaround, i.e. using a template and a special field in the JV such as Ref. 1/2?
    My doubts:
    I understand that the Chart of Accounts structure is one and common for IFRS and the other accounting method. We need to create only those accounts separately (2 times, with a prefix, like IFRS revenue account, GAAP revenue account).
    Now at the time of entry, assume some entries / adjustments are specifically for IFRS and not for the other ledger. In this case, what needs to be done?
    You have mentioned the DTW approach, but do we need to insert all JVs again for the other ledger?
    Someone suggested that for any entry which is specific to the IFRS ledger, we need to use the Ref. 1/2 column or the Transaction Code column, in which we can put IFRS.
    Based on this, we would need to create 2 separate templates, for IFRS and for the other ledger, for all reports.
    This is my understanding of the solution in SAP B1. Please help me clarify my thought process.
    Please do the needful. If you have done an implementation and can share a document, it would be a great help.
    Email :[email protected]

  • Display First Name and Last Name of generating user in Siebel BIP Report

    Hello,
    My client has a requirement to display the First Name and Last Name of the user who generated the report. I need some help with this requirement. Thanks
    Eg: Generated by <First Name> <Last Name>
    Generated by Siebel Administrator
    Regards,
    Hari Venkat.

    Hey Rob,
    Is this search help something that you have developed? Can you explain a little more about how the functionality works? Is this triggering an operation in your BAdI?
    Cheers,
    Kevin

  • How to generate Oracle Statistics

    Hi Team,
    I want to gather statistics on 18 tables in the production database.
    I have executed the command below in the local database for testing (I found the query via Google).
    EXEC dbms_stats.gather_table_stats('SCOTT','EMP',cascade=>TRUE);
    The query executed, but no output is displayed on the console, i.e. on screen.
    Please suggest: is it possible to run this for 18 tables in a single command, and how do I get the output?
    Thanks & Regards,
    Venkat

    Venkat wrote:
    Hi Team,
    I want to gather statistics on 18 tables in the production database.
    I have executed the command below in the local database for testing (I found the query via Google).
    EXEC dbms_stats.gather_table_stats('SCOTT','EMP',cascade=>TRUE);
    The query executed, but no output is displayed on the console, i.e. on screen.
    Please suggest: is it possible to run this for 18 tables in a single command, and how do I get the output?
    There won't be any output from the package's execution, as it only updates the data dictionary. So check and compare views such as DBA_TABLES and DBA_TAB_COL_STATISTICS before and after running this package.
    HTH
    Aman....
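    For running it on all 18 tables, one straightforward option is a small shell loop around SQL*Plus, and comparing LAST_ANALYZED before and after shows the before/after effect suggested above (sketch only; credentials, schema and table names are placeholders):
    #!/bin/bash
    # Sketch: gather statistics for a fixed list of tables, then show when each
    # table was last analyzed (all names and credentials are placeholders).
    TABLES="EMP DEPT SALGRADE"   # replace with your 18 table names
    for t in $TABLES; do
      echo "EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname=>'SCOTT', tabname=>'$t', cascade=>TRUE)" \
        | sqlplus -s scott/tiger@orcl
    done
    # DBMS_STATS itself prints nothing; check the dictionary instead:
    echo "SELECT table_name, last_analyzed FROM user_tables ORDER BY table_name;" \
      | sqlplus -s scott/tiger@orcl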

  • Manual for Upload Deltas

    Hi,
    Does somebody know of a manual for the delta loads
    and their explanation?
    Regards,
    R.A.

  • Collection and manual file upload

    Okay, I have a collection. With this collection there can be at least one row, or there could be many. Some of the fields are user-updateable, so pretty much I have a tabular form on a collection. I have all my page processes to handle the table and perform DML on the table correctly. I now have a requirement to be able to upload a file for each row. Therefore what I did was this....
    ....,'<input name="f18" type="file"/>' as "Disposal Document"....in my select list for my collection. FYI, it is PL/SQL returning SQL.
    Then in the page processing, when I "save" off the collection using the collection API, I saved G_F18(i) in clob001. Then I just did an insert statement into my table based on the collection. Cool...nothing too hard.
    Now, to download an image, another page calls this javascript, just so the document opens in a new window...
    function documentpoup(disposal_id){
    window.open('/pls/apex/#OWNER#.download_file?pid='+disposal_id,'Attachment','width=800,height=600,toolbar=yes,location=yes,directories=yes,status=yes,menubar=yes,scrollbars=yes,copyhistory=yes,resizable=yes');
    }
    Now my procedure converts my CLOB into a BLOB to download it...
      l_mime_type                varchar2(255); 
      l_length                number; 
      l_content_file_name           varchar2(2000); 
      l_content_file           blob;
      myclob clob;
    l_dest_offset   integer := 1;
    l_source_offset integer := 1;
    l_lang_context  integer := DBMS_LOB.DEFAULT_LANG_CTX;
    l_warning       integer := DBMS_LOB.WARN_INCONVERTIBLE_CHAR;
    begin
      select null, disposal_document, item_number||'-'|| org_code ,dbms_lob.getlength(disposal_document)
        into l_mime_type, myclob, l_content_file_name, l_length 
        from material_disposal
        where disposal_id = pid;
      DBMS_LOB.CREATETEMPORARY(l_content_file, TRUE);
      DBMS_LOB.CONVERTTOBLOB(
        dest_lob     => l_content_file,
        src_clob     => myclob,
        amount       => DBMS_LOB.LOBMAXSIZE,
        dest_offset  => l_dest_offset,
        src_offset   => l_source_offset,
        blob_csid    => DBMS_LOB.DEFAULT_CSID,
        lang_context => l_lang_context,
        warning      => l_warning
      );
      owa_util.mime_header(nvl(l_mime_type, 'application/msword'), FALSE);
      htp.p('Content-length: ' || l_length); 
      htp.p('Content-Disposition: inline; filename="' || l_content_file_name || '"');
      owa_util.http_header_close; 
      wpg_docload.download_file(l_content_file);
    When the file downloads, all I have in my "document" is the name of the file... It is like the file is not actually being uploaded. So something is wrong.... :(
    Some more info: the reason I am using a CLOB and not just a BLOB is because of the collection. If I could figure out a way not to mess with the CLOB and only use the BLOB, that would be cool, but since I am not using an APEX item and am using plain HTML, for some reason the files will not go into the wwv_flow_files table. I have made a dummy page to test this and I can't get it to work; i.e. if I have an APEX item of type File Browse, then the file is uploaded, but if I have an HTML region with an input of type "file", then that file is not uploaded.... Has anyone seen or had to do anything like this, or could anyone point me in the right direction?!?
    Thanks,
    David

    does anyone have any ideas?
    -David

  • My Canon Mark III is set to RAW files and they are uploading as JPGs.. Anyone know why? I'm shooting in manual and uploading into iPhoto.. Thanks!

    I am setting my camera to RAW and it's only loading JPGs.. I am using an SD card.. Could that be why? Should I use a CF card instead?

    No - if you are shooting RAW then that is what is imported into iPhoto - when you import RAW, iPhoto makes a JPEG preview of it for use by other programs.
    Why do you think iPhoto does not have the RAW?
    LN
