How to unblock sales orders (more than 1000 sales orders)

Hi Gurus,
We are facing a problem: because of a weight issue, 1000 sales orders are blocked. Only one authorised person can unblock them, and he has all 1000 orders in his mail inbox.
Unblocking the orders one by one will take a long time.
Is there any T-code to unblock all the orders at once?
A valuable answer will be rewarded.
Regards
rama

Dear Rama
Go to transaction MASS, enter object type BUS2032 and execute. There select the block "Sales order header data" and click on "Fields". Here you can see two fields, viz. MASSSDHEAD_S-LIFSK (for the delivery block) and MASSSDHEAD_S-FAKSK (for the billing block). Choose whichever is blocked and execute, then execute again. Now you can see two tabs; one is "New Values", and below it you can see those sales orders. Maintain the new value on top and execute.
thanks
G. Lakshmipathi

Similar Messages

  • How can I get more than 1000 items in a Custom List displaying items?

    http://.....sites/_vti_bin/listdata.svc/AddressBook(List) returns only 1000 items, but I have more than 1000 items in my AddressBook list. I want to get items
    from the AddressBook list and bind them elsewhere using the autocomplete method, like this:
    =======>>>>listurl=http://.....sites/_vti_bin/listdata.svc/AddressBook.
    and my code (using CAML) is below:
    protected void btnpopulatedetails_Click(object sender, EventArgs e)
    {
        SPSite objSite = SPContext.Current.Site;
        SPWeb objWeb = objSite.OpenWeb();
        objWeb.AllowUnsafeUpdates = true;
        SPList list = objWeb.Lists["Address Book"];
        SPQuery query = new SPQuery();
        // Override the list view throttle so the query is not cut off at the threshold
        query.QueryThrottleMode = SPQueryThrottleOption.Override;
        query.Query = "<Where><Eq><FieldRef Name='Title'/><Value Type='Text'>" + txtcustomer.Value + "</Value></Eq></Where>";
        query.RowLimit = 500;
        SPListItemCollection items = list.GetItems(query);
    }
    function fnCustomerSearchBind() {
        if ($("input[id$='txtcustomer']").length != 0) {
            var collcustomer = searchCustomers('', urlTo, fieldto);
            $("input[id$='txtcustomer']").autocomplete({
                source: collcustomer,
                minLength: 0,
                select: function (event, ui) {
                    event.preventDefault();
                    $("input[id$='txtcustomer']").val(ui.item.value);
                    $("input[id$='btnpopulatedetails']").click();
                    return false;
                }
            });
        }
    }
    function searchCustomers(value, listurl, fieldto) {
        var collcustList = new Array();
        var custresults;
        var url = listurl + "?$filter=startswith('" + fieldto + "','" + value + "')";
        $.ajax({
            cache: true,
            type: "GET",
            async: false,
            dataType: "json",
            url: url,
            success: function (data) {
                custresults = data.d.results;
                // collect the 'To' field of every returned customer
                for (var x = 0; x < custresults.length; x++) {
                    collcustList.push(custresults[x]['To']);
                }
            }
        });
        return collcustList;
    }
    From my research, listurl=http://.....sites/_vti_bin/listdata.svc/AddressBook returns only the first page of 1000 items by default, so how can I change that default from the first page to all pages through
    code? I think you can understand my environment from the code above; if not, I am using an Office 365 sandbox solution. One more thing I know from my research: the paging can be changed by using ADO.NET Data Services, but I tried that and it supports Windows
    Server 2008 R2 only, and I am using Windows Server 2012 R2. Please help me if you know the answer.

    Hi,
    Here is a blog that would be helpful:
    ADO.NET Data Services returns 1000 items
    https://gilleslauwers.wordpress.com/2010/12/08/ado-net-data-services-returns-1000-items/
    Best Regards,
    Dennis Guo
    TechNet Community Support

  • ORA-24335 - cannot support more than 1000 columns - How to solve this?

    Hi,
    I get the error message 'ORA-24335 - cannot support more than 1000 columns' when I try to insert a number of rows into a table with the following code:
    INSERT ALL
    INTO tableA Values ('A', 'B', 'C', 1, 2, 3)
    INTO tableA Values ('D', 'E', 'F', 4, 5, 6)
    INTO tableA Values ('G', 'H', 'I', 7, 8, 9)
    SELECT *
    FROM DUAL
    How to solve above?
    What does it really mean? Is it as simple as: if my table has 10 columns, then I can't insert more than 100 rows (1000 columns divided by 10 columns = a maximum of 100 rows to insert)? Or not?
    Is there a better/more appropriate way to insert many rows?
    Should I do my inserts in an OracleTransaction (my application is developed with C# and ASP.NET), as follows:
    BEGIN
        INSERT INTO tableA VALUES ('A', 'B', 'C', 1, 2, 3);
        INSERT INTO tableA VALUES ('D', 'E', 'F', 4, 5, 6);
        INSERT INTO tableA VALUES ('G', 'H', 'I', 7, 8, 9);
        COMMIT;
    END;
    Thx in advance!
    Edited by: user8819407 on 2010-mar-07 15:40

    Hi,
    So how did you solve the problem? Can you please give an example?
    I have the same problem inserting over 1000 values into my Oracle DB. The table only has 51 columns but to speed up the insert command, I use bulk inserts via insert all command. I need to insert 100 rows at a time for a total of 5100 values.
    Example:
    SQLCmd = INSERT ALL INTO MYTABLE (VAL_ID,VAL_NAME,VAL_NUM,VAL_TIME,VAL_CMM,VAL_TIME1,VAL_TRIP,VAL_REMAINING,VAL_AGE,VAL_COUNT,VAL_RSTS,VAL_VOLT,VAL_RF,VAL_DC,VAL_HOPS,VAL_STATUS,VAL_ELAPSED,MODIFY_TIME) VALUES (?,?,?,TO_DATE(?,?),?,TO_DATE(?,?),?,?,?,?,?,?,?,?,?,?,?,SYSTIMESTAMP)
    INTO MYTABLE (VAL_ID,VAL_NAME,VAL_NUM,VAL_TIME,VAL_CMM,VAL_TIME1,VAL_TRIP,VAL_REMAINING,VAL_AGE,VAL_COUNT,VAL_RSTS,VAL_VOLT,VAL_RF,VAL_DC,VAL_HOPS,VAL_STATUS,VAL_ELAPSED,MODIFY_TIME) VALUES (?,?,?,TO_DATE(?,?),?,TO_DATE(?,?),?,?,?,?,?,?,?,?,?,?,?,SYSTIMESTAMP)
    INTO MYTABLE (VAL_ID,VAL_NAME,VAL_NUM,VAL_TIME,VAL_CMM,VAL_TIME1,VAL_TRIP,VAL_REMAINING,VAL_AGE,VAL_COUNT,VAL_RSTS,VAL_VOLT,VAL_RF,VAL_DC,VAL_HOPS,VAL_STATUS,VAL_ELAPSED,MODIFY_TIME) VALUES (?,?,?,TO_DATE(?,?),?,TO_DATE(?,?),?,?,?,?,?,?,?,?,?,?,?,SYSTIMESTAMP)
    INTO MYTABLE (VAL_ID,VAL_NAME,VAL_NUM,VAL_TIME,VAL_CMM,VAL_TIME1,VAL_TRIP,VAL_REMAINING,VAL_AGE,VAL_COUNT,VAL_RSTS,VAL_VOLT,VAL_RF,VAL_DC,VAL_HOPS,VAL_STATUS,VAL_ELAPSED,MODIFY_TIME) VALUES (?,?,?,TO_DATE(?,?),?,TO_DATE(?,?),?,?,?,?,?,?,?,?,?,?,?,SYSTIMESTAMP)
    SELECT 1 FROM DUAL
    SQLVals = 1 22C38299 80700334 04-19-2012 13:55:33 mm-dd-yyyy hh24:mi:ss MCC93000 04/19/2012 13:55:42 mm-dd-yyyy hh24:mi:ss 12 4 0.792191 23 1 0.00 -113.50 13.48 9 1 9
    1 36PR7038 8070EDC2 04-19-2012 12:24:35 mm-dd-yyyy hh24:mi:ss MCC60360 04/19/2012 12:24:41 mm-dd-yyyy hh24:mi:ss 7 4 0.757501 3 2 13.88 -114.80 14.06 5 1 6
    1 42C63512 8050166F 04-19-2012 16:02:50 mm-dd-yyyy hh24:mi:ss MCC52420 04/19/2012 16:02:57 mm-dd-yyyy hh24:mi:ss 10 4 0.778471 8 1 0.00 -122.30 13.05 8 1 7
    1 33MR3076 80803E75 04-19-2012 13:13:16 mm-dd-yyyy hh24:mi:ss MCC60330 04/19/2012 13:13:22 mm-dd-yyyy hh24:mi:ss 13 5 0.636721 28 3 0.00 -122.19 0.70 8 1 6
    Then I call: &DBInsert($sqlCmd, @sqlVals);
    This is the error I get: ORA-24335: cannot support more than 1000 columns.
    I need to insert about 879,500 rows (51 cols per row). The values I read from a text file in sets of about 48,000 and put them into a hash. I tried inserting one row at a time, but it takes too long, about 28 hrs! Then, I tried the bulk update with only 6 rows and the total time was about 25 minutes. Which is acceptable to us.
    Thanks!
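    A hedged reading of the two reports above (my own inference, not from an original reply): the 1000 limit appears to count the total number of values across every INTO clause of a single INSERT ALL, which matches 100 rows x 51 columns failing while 6 rows succeed. If so, the rows have to be batched so that rows x values-per-row stays under the limit. A minimal sketch using the 6-value rows from the first post:
    -- Sketch only: with 6 values per row, roughly 160 rows fit in one INSERT ALL;
    -- loop over batches of that size in the calling code and commit per batch.
    INSERT ALL
        INTO tableA VALUES ('A', 'B', 'C', 1, 2, 3)
        INTO tableA VALUES ('D', 'E', 'F', 4, 5, 6)
        INTO tableA VALUES ('G', 'H', 'I', 7, 8, 9)
    SELECT * FROM DUAL;
    COMMIT;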

  • How to pass more than 1000 entries in 'IN' clause, Oracle 11g

    Hi All,
    I know this is a very common question on the Oracle discussion forum, but my situation is a bit different.
    I use C#, .NET and Oracle 11g. I have a situation where I build a query statement with an 'IN' clause in C# code based on my requirement and execute that statement from the code itself with an OracleConnection object. I do not have any procedures; I must phrase my query statement and pass it to the OracleConnection object to execute it.
    My code looks like this....
    List<decimal> x_Ids = new List<decimal>();
    I will load my IDs into x_Ids here;
    string whereInClause = ........I will prepare an 'IN' clause here (all IDs separated by ',')
    My query would look like this:
    string query = select * from MYTABLE where X_ID in [ whereInClause with more than 1000 entries]
    oraConn.ExecuteQuery(query);
    I have a workaround with the OR operator and the 'IN' clause, like below:
    X_ID in [ Ids till 1000 entries] OR X_ID in [Next 1000 entries] OR X_ID in [Next 1000 entries] ...and so on...
    It works, but I have heard that this may slow down the performance of the application. Is this really a performance hit for my application?
    Can you please suggest any other workaround to overcome this situation?
    Can you please suggest any other workaround to overcome this situation?

    > I have a workaround with the OR operator and the 'IN' clause, like below:
    > X_ID in [ Ids till 1000 entries] OR X_ID in [Next 1000 entries] ...and so on...
    > It works, but I have heard that this may slow down the performance of the application. Is this really a performance hit for my application?
    There should be no performance difference between a statement like
    select * from myTab
    where ID in (1,2,3,4,5)
    OR ID in (6,7,8,10,12)
    and
    select * from myTab
    where ID in (1,2,3,4,5,6,7,8,10,12)
    The execution plan should be identical.
    However, those values might better be sent as a single object (a collection or table-of-numbers type).
    I think the ODP or OO4O connectivity allows you to create such Oracle object types.
    Another way could be to think about how all the values are created. Did a user enter them manually? Certainly not. Then maybe you can apply the same logic in the SQL statement that created those values in your .NET application,
    something like
    select * from myTab
    where ID in (select t2.FK_ID from otherTab t2 where t2.Col1 = 100)
    This approach would probably beat all the others performance-wise, since you avoid the overhead of constructing the IN lists.
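    The "single object" suggestion above can be sketched as follows (a minimal illustration; the type name id_tab and the bind name :ids are made up for the example and are not from the original thread):
    -- Illustrative only: bind the whole ID list as one collection instead of a literal IN list,
    -- so the 1000-element limit on expression lists never comes into play.
    CREATE OR REPLACE TYPE id_tab AS TABLE OF NUMBER;
    /
    -- The client binds the collection to :ids in a single round trip.
    SELECT *
      FROM myTab
     WHERE ID IN (SELECT COLUMN_VALUE FROM TABLE(:ids));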

  • How to create more than 1000 entities in FDQM

    Hi Gurus
    1. How can we load locations into FDQM without manual entry?
    Can we load locations by using flat files?
    Requirement: I have to create more than 1000 entities in the FDM application for mapping targets in the HFM application.
    How can this be achieved?
    regards
    Dev

    Two things:
    #1 - The tables you need to add data to are: tPOVPartition, tStructPartitionHierarchy, tStructPartitionLinks. The first table holds the basic location information. The last two tables define the relationship hierarchy of where the locations belong. I would expect that if the Struct information is missing, FDM will not be able to place the location anywhere on the locations screen.
    Additionally, for the tPOVPartition, there is required information and you need to make sure you are including all of that info when you create your record.
    #2 - If you are handy at coding, I'm positive there is an API call to create a location. It would make more sense perhaps to write a script that uses the APIs to create the locations so that you can be sure that it is being done right.

  • Assigning more than 1000 orders in one WBS

    Hi Experts,
    I have a requirement to assign more than 1000 orders to one WBS element. As per the SAP note, a maximum of 1000 orders can be assigned to one WBS, otherwise system performance may be affected. But as per the client requirement, either I should allow more than 1000 orders, or the system should give an error message when the assignment crosses this limit... I checked for user exits (for orders) and could not find any for this... any experience / suggestions?
    Thanks
    Bala

    Hi,
    If more than 1000 orders have been assigned under one WBS, then system performance will suffer: whenever you retrieve data or reports for that WBS, the system response will be slow.
    If the client requirement is like that, then you can create more than 1000 orders,
    or else you can break the WBS into sub-WBS elements and create the orders under different heads.
    You can create more than 1000 orders under one WBS; the system won't give any error message.
    Regards,
    Raj

  • How to find tables which have more than 1000 partitions

    Hi All,
    Is there any other way to find out which tables have more than 1000 partitions?
    Apart from the SAP report RSDD_MSSQL_CUBEANALYZE, that is, because this report is not working for me: the job keeps getting cancelled with the ABAP dump DBIF_DSQL2_SQL. I already checked SAP Note 1309838, but it's not applicable for us because we are on the highest support package level, SAP_BW 701 SP06.
    Thanks,
    Harshal

    If you are running SQL Server as your database platform run this query:
    select o.object_id, o.name, p.Partition_count
    from sys.objects o
    inner join
        (select object_id, count(distinct partition_number) as Partition_count
         from sys.partitions
         group by object_id) p
        on o.object_id = p.object_id
    where o.type = 'U'
    order by p.Partition_count desc
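    A small variation on the query above (not from the original reply) restricts the output to tables that actually exceed 1000 partitions:
    -- Same query, but only return user tables with more than 1000 partitions
    select o.object_id, o.name, p.Partition_count
    from sys.objects o
    inner join
        (select object_id, count(distinct partition_number) as Partition_count
         from sys.partitions
         group by object_id
         having count(distinct partition_number) > 1000) p
        on o.object_id = p.object_id
    where o.type = 'U'
    order by p.Partition_count desc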

  • Query challenge: (IN) does not allow more than 1000 numbers

    I have this query where I am passing more than 1000 numbers in the "IN" condition, and it is showing me an error saying I can't put more than 1000 numbers in the "IN" condition
    (numbers like 1,2,3,4,... up to 1000). I am getting the error.
    How can I write this query differently?
    select a.barcode,a.othernumber,a.owner,b.booknum,b.context,c.dept,c.userid,
    c.firstname,c.lastname from objects a,bookitem b,users c
    where a.othernumber=b.booknum
    AND c.userid=a.owner(+)
    AND booknum in (#valuelist(search.booknum)#)
    order by barcode

    you could use a UNION
    e.g.
    select oid, dob from mytable
    where oid in ( mylist of first 1000 )
    UNION
    select oid, dob from mytable
    where oid in ( mylist of the next 1000 or less )

  • Cannot Enter more than 1000 partners in Customer master data(Partner Funct)

    Dear Friends,
    In Customer Master Data -> Sales Area Data -> Partner Functions, I am able to enter 1000 partners.
    Now, when I try to enter more than 1000 partners, the system does not allow me (no lines are available to enter a partner).
    According to my requirement I have to enter another 500 partners (business requirement).
    Please suggest how I can enter more than 1000 partners in the customer master.
    A quick reply will be appreciated and rewarded.
    Thank you

    Hello,
    It is very clear that, unless you change the data element for KNVP-PARZA, the system will not allow you to maintain more than 999 partner functions for a customer master.
    This is a very exceptional case; I handled the situation in my project by duplicating the customer master data for the sold-to party and maintaining the remaining partner functions there.
    Please check with your business/architect team whether they allow this. As an alternative, they would have to go with a duplicate of the sold-to party for the maintenance of additional ship-to parties.
    OR
    Go for a character-length change of the data element KNVP-PARZA (NOT RECOMMENDED).
    Hope it helps. Let me know if not.
    Thanks,
    Ram.

  • Error Creating Invoices with more than 1000 positions

    Hello.
    I am creating invoices in the SD module with transaction VF01. When a document has more than 1000 positions, an error is displayed.
    The text of the error is:
    Memory area p.status GUI SAPMV60A SM is too small
    (original message: "área memoria p.status GUI SAPMV60A SM demasiado pequeña").
    I would like to know if there is any way to correct this error. If it happens because memory is too small, how could it be increased?
    Thank you.
    Diana Carolina.

    Hi Diana
    As far as I know, standard SAP restricts the number of line items in an accounting document to 999.
    This in turn impacts the number of line items you can have on a sales invoice:
    since every invoice line item produces at least 2 accounting entries, the maximum number of invoice line items you can have is 499.
    I don't think there is a workaround for this.
    Rgds

  • Create table with more than 1000 columns

    Hello
    I have 1500+ columns in my select query. How do I create a table from it? My query is:
    create table xyz nologging as
    select a1,a2,a3,a4,a5,b1,b2,b3,b4,b5,................a1500
    from abcd a,bcde b
    where a.a1=b.b1
    In Oracle 10g I can create only 1000 columns in a table.
    Please suggest

    799301 wrote:
    All my columns are formulated from different tables and I want to create one intermediate table;
    the query below is just an example:
    select count(unique((case when sysdate >45 then video=Y and audio=N then 1 else 0 end))) vd_45,
    count(unique((case when sysdate >105 then video=Y and audio=N then 1 else 0 end))) vd_105,
    count(unique((case when sysdate >195 then video=Y and audio=N then 1 else 0 end))) vd_195,
    count(unique((case when sysdate >380 then video=Y and audio=N then 1 else 0 end))) vd_380,
    count(unique((case when sysdate >45 then video=N and audio=Y then 1 else 0 end))) ad_45,
    count(unique((case when sysdate >105 then video=N and audio=Y then 1 else 0 end))) ad_105,
    count(unique((case when sysdate >195 then video=N and audio=Y then 1 else 0 end))) ad_195,
    count(unique((case when sysdate >380 then video=N and audio=Y then 1 else 0 end))) ad_380
    from abcd a,bcde b
    where a.a1=b.b1;
    please suggest how to create more than 1000 table columns
    By creating a proper relation between the "formulated function" and the stored value.
    Simplistically put:
    Do you model invoices as entities INVOICES_2010, INVOICES_2011 and so on?
    Or do you model them as INVOICES with a year (date) attribute?
    It does not make sense to put attribute data into the name of an entity. Nor does it make sense to put relationship data into the name of an attribute and create 1000+ attributes as a result.
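    A hedged sketch of what that normalisation could look like for the example quoted above (all names are illustrative, not from the original post): instead of one column per threshold and medium (vd_45, ad_105, ...), the threshold and medium become attributes and each combination becomes a row.
    -- Illustrative only: the thresholds and media are data, not column names.
    create table usage_counts (
        threshold_days number,        -- 45, 105, 195, 380, ...
        medium         varchar2(10),  -- 'VIDEO' or 'AUDIO'
        distinct_count number
    );
    -- One row per (threshold, medium) instead of one column per combination:
    select threshold_days, medium, distinct_count
    from usage_counts;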

  • List with more than 1000 elements

    I'm trying to use this query:
    select *
    from street
    where street_cod in ('street_code_list')
    It works well when the 'street_code_list' has fewer than 1000 codes, but it fails when it has more, with a message saying that a list can't have more than 1000 elements. How can I fix this problem? Can I use any other kind of object to store my list?
    I'm querying the database in ColdFusion using ODBC.
    Thank you in advance,
    Rui

    Another thing to consider doing is simply creating a "list table" which would contain your list elements and allow you to do a
    SELECT X
    FROM Y
    WHERE X IN (SELECT X
    FROM Z)
    type query.
    I don't know where you are getting your list from, but if it is persistent (i.e., it doesn't change with every program run) this is certainly a viable option (and it may be usable even if the list does change every run).
    If this is being executed in PL/SQL (or some other procedural language) you could also just loop through your list: execute the query with the first 1000 list elements and do your processing, then the next, and so on. Of course, to some extent this depends on your processing, but if you need the entire result set before you can process it, then you could store the individual result sets in a PL/SQL table or an array (if you are using Pro*C or whatever) and then process when the PL/SQL table/array was fully populated.
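    A concrete version of the "list table" idea for the street query above might look like this (the list-table name is made up for the illustration; street and street_cod come from the original query):
    -- Illustrative sketch: keep the codes in a table instead of a literal IN list.
    create table street_code_list (street_cod varchar2(20));
    -- load the codes (once, or per run), then:
    select s.*
    from street s
    where s.street_cod in (select l.street_cod from street_code_list l);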

  • More than 1000 record in IN clause

    RDBMS : Oracle 10.2
    OS : CentOS 5
    Hi,
    I have a situation where I have to compare more than 1000 ids in my IN clause.
    I have used something like
    select col,col2....
    from tble ......
    where .....
    and id in (--first 1000 ids ---)
    or id in (--next 1000 ids ---);
    The query is taking a lot of time. If I omit the second IN clause, it executes instantly.
    Can you please suggest how to handle such situation ?

    Take your full list of 2000 ids and paste them into a program like Notepad++ (or even regular Notepad), then use the replace function to replace all commas with: " from dual union all select ". Add "with a as (select " to the front and " from dual)" to the end, and also add an alias like aid to the first item in the list. Paste that on the front of your query and change your "in" clause to say "id in (select aid from a)".
    Then instead of this:
    select col,col2
      from tble
    where id in (1,2,3,4,5);
    you'll use this:
    with a as (select 1 aid from dual union all
               select 2 from dual union all
               select 3 from dual union all
               select 4 from dual union all
               select 5 from dual)
    select col,col2
      from tble
    where id in (select aid from a);

  • More than 1000 Line items

    Hi
    A user has done the billing for 1300 line items using transaction code VF01. After completing that process there is an option to release to accounting; when the user selects that button, the system throws the error "Maximum number of items in FI
    reached".
    We know that on the FI side the system will not allow more than 999 line items. We are using version 4.7.
    The invoice has already been sent to the customer, so how can this issue be solved?
    Please provide me the solution as early as possible.
    Your help would really appreciated.
    Regards,
    Schilukuri

    Hi to everyone,
    My issue has been resolved.
    My requirement was: a user has done billing (VF01) for more than 1000 line items. He was then releasing that billing document to accounting, and the system was throwing the error message "Maximum number of items in FI
    reached".
    Solution:
    Use transaction OBCY, enter work area VBRK, give table name BSEG, and enter the field names by which you want to group identical line items (the document may contain some identical line items).
    Refer to SAP Note 36353.
    Regards,
    Schilukuri

  • More than 1000 values

    Dear All,
    I am using Oracle 10g with ODP.NET. I have more than 1000 values to be passed in the IN condition. How do I achieve this? When I pass the values it says only 1000 parameters are allowed in the IN condition.
    Please let me know.
    Regards
    Fkhan

    fkhan wrote:
    What I have thought of is to insert all the selected values into a global temporary table and then, in my procedure, use a subquery over the values from the global temporary table.
    A better approach than using a massive IN list of values. But it is still problematic in terms of having a user interface deal with a 1000+ item/code selection. It is still a performance problem to ship that amount of data across the network every single time from every single client running this app. And it is still flawed in terms of data modelling and 3NF.
    The correct approach would be permanent tables to deal with this. In its basic and simplest form:
    // defines a keyword for 1000+ items/codes
    KEYWORDS = ( keyword_id, keyword_description )
    // associates a keyword id with an item/code id
    KEYWORD_MAP = ( keyword_id, code_id )
    The user interface displays the keyword description. The select clause uses the keyword_id to select from KEYWORD_MAP the associated list of values as filter criteria for the main query (using a join or an IN sub-select, for example).
    If the keyword is more complex than this basic approach (e.g. partially dynamic and specific per user), then this basic approach can be extended to serve as a "favourites per user" list (kind of like bookmarks, but per individual user). Or it can be designed to cater for the business requirements.
    But as you have (very simplistically, I realise) described the problem so far, I do not see a problem as much as a basic flaw in your code and data model.
    A user interface for entering and selecting 1000+ codes is clunky. And slow. Sending that data across to Oracle is slow. Inserting it each time into a temp table for use is slow.
    There are a lot of moving parts here. And that always contributes to degrading performance and making stuff complex (and usually unnecessarily so).
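    A minimal sketch of the KEYWORDS / KEYWORD_MAP design described above (table and column names follow the post; myTab and the bind variable :keyword_id are illustrative):
    -- Illustrative only: the user picks a keyword, the map supplies the 1000+ code IDs.
    create table keywords (
        keyword_id          number primary key,
        keyword_description varchar2(100)
    );
    create table keyword_map (
        keyword_id number references keywords,
        code_id    number
    );
    select t.*
    from myTab t
    where t.id in (select m.code_id
                   from keyword_map m
                   where m.keyword_id = :keyword_id);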

  • More than 1000 columns

    In which database can I create more than 1000 columns in a table?

    Hi.
    I can't imagine a problem in real life where you would need such a number of columns.
    I can't imagine how you would administer such an object.
    I think that you don't know the database normalization concept.
    There is a lot of documentation out there about this concept; search the web for "database normalization" and take a minute to read about it.
    Regards.
    Daniel
    [If you think my writing is bad, you must listen to me speaking! ;-) ]
