Sizing of BW System

Hi BW Experts,
We are facing a lot of problems with insufficient memory/space in the BW system, and we are unable to run the daily jobs.
So I stopped all process chains and other background jobs on 24/09/2007.
I have not rescheduled the jobs since then.
We spoke to the Basis team, and they asked us to do a sizing of the BW system.
We do not know how to do a sizing.
Now we are unable to load even 30 records.
Should I delete the requests from the PSA?
If I delete the PSA requests, will that free up space in the BW system?
How do we do the sizing of the BW system?
Please advise.
Thanks in advance.
Regards,
Anjali

Hi,
First, call transaction DB02 and check the current database sizes.
If your DB is full, try to delete some data, for example test data, PSA data created by data mart scenarios, PSA requests from full master data loads, or ODS change logs.
For the sizing, go to service.sap.com/bi, Media Library / Performance / Hardware Sizing.
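As a very rough illustration of the kind of estimate such a sizing exercise produces (the byte count and overhead factor below are assumptions for illustration, not official SAP figures):

```python
# Hedged sketch: a very rough disk-sizing estimate for a BW InfoProvider.
# bytes_per_record and overhead_factor are illustrative assumptions only;
# use the official sizing guide / Quick Sizer figures for a real exercise.

def rough_infocube_size_gb(records, bytes_per_record=1000, overhead_factor=2.0):
    """records: expected fact-table rows; overhead_factor roughly covers
    indexes, aggregates and PSA copies (an assumption, not an SAP figure)."""
    return records * bytes_per_record * overhead_factor / (1024 ** 3)

# Example: 50 million records at ~1 KB each with 2x overhead -> roughly 93 GB
print(f"{rough_infocube_size_gb(50_000_000):.0f} GB")
```

The real sizing guide breaks this down per InfoCube, DSO, PSA and master data table, so treat the single factor above as a placeholder only.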
hope this helps...
Olivier.

Similar Messages

  • IPlanet performance information required (for sizing a new system)

iPlanet performance. I am currently sizing a new system and need some information about the performance of iPlanet, for example the number of pages served against CPU usage. Does anyone know if there is a standard out there that records this information (similar to TPC for database performance)? Cheers. Neil

    Hi,
If you are planning to go with EP 7, then the installation master guide can help you. For EP 7 you need:
1) Oracle 10.2.0.2
2) WAS Java or ABAP
3) EP installation master DVD (sapinst.exe)
4) Support packs (e.g. SP 10)
This installation has KMC built in, which means you do not need to install it separately, unlike EP6. Once the installation is done you need to apply for a license on https://service.sap.com --> key & requests; SAP will send you a license file, which is a TXT file, and you only need to upload it to your server.
    Hope this helps,
    Regards,
    Ameya

  • Sizing SAP Quality System ?

    Hello,
Does anybody know a good method to size an SAP quality system (mySAP ERP 2005 and BI 7.0 (BI-JAVA))?
    Thanks for reply.
    Bye.

    Hello,
Use the SAP standard sizing tool at http://service.sap.com/sizing and
also check the link below for more details:
    https://websmp103.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000671158& --> Hardware Sizing
    Hope this will help you.
    Do revert for any more info.
    Best Regards,
    Sachin.

  • Guidelines on sizing IS-U system with IDE turned on

Hello! Does anyone have any documentation on how to do sizing for an IS-U system that will utilize the IDE feature? For example, what specific items do we need from the customer, and how do we use these items as factors in the sizing (e.g. X points of delivery would typically require x MB of memory and x MB of disk space)?
Alternatively, is there a specific SD or FI process to which we can liken a specific IDE deregulation process (for example, is a change of supplier equivalent to creating an order in the SD module)?

    Hi
    I would suggest that SAP are probably the best people to talk to regarding this. When you consider sizing IS-U with IDE there are a number of things to consider:
- The size of your customer base and, as you mention, the number of PODs you expect to have in the system.
    - The components/functionality you are planning to use in IDE.
    - The number of processes and the volume of industry flows you expect to send between yourselves and the market participants.
- The exception scenarios you may encounter and how these are going to be managed, i.e. workflow, EMMA/BPEM.
    - Are you connecting IS-U to SAP CRM?
    This is probably not an exhaustive list but hopefully will help.
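If it helps to see the shape of such a calculation, here is a purely hypothetical sketch; every figure in it is a placeholder, not an SAP-published sizing value:

```python
# Purely hypothetical sketch of an IS-U/IDE sizing estimate based on the
# factors listed above. The per-POD and per-flow figures are placeholders,
# NOT SAP-published values; replace them with numbers from SAP / Quick Sizer.

pods = 200_000                 # points of delivery (assumed)
flows_per_pod_per_month = 4    # switch / meter read / billing flows (assumed)
kb_disk_per_pod = 50           # placeholder master data footprint per POD
kb_disk_per_flow = 5           # placeholder size of a stored industry flow
months_retained = 24           # assumed retention period for flows

master_data_gb = pods * kb_disk_per_pod / (1024 * 1024)
flow_data_gb = pods * flows_per_pod_per_month * months_retained * kb_disk_per_flow / (1024 * 1024)
print(f"Master data: ~{master_data_gb:.0f} GB, IDE flows: ~{flow_data_gb:.0f} GB")
```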
    Best Regards,
    Trevor

  • Sizing a standby system for Maximum Availability

- Having primary and secondary sites with the SAME database server configuration. The server configuration will either be an SMP system with lots of processors or a RAC consisting of a number of 4-way SMP nodes.
- Data Guard shall be used for the standby "functionality". From a functional point of view, the standby can be either physical or logical.
- The application is very "update/insert intensive" (~50%).
According to the documentation I have read, the recommendation seems to be to use physical standby, but I am a little worried about the performance of the "Log Apply Service" on a multiprocessor system. If I have understood it correctly, the managed recovery process is single-threaded and can only exist in one instance, while the LSP process manages several processes/execution threads.
Is it therefore possible that logical standby may have greater throughput than physical standby on an SMP system, or is "Managed Recovery" always more effective?
Will I have problems whichever solution I choose, i.e. are there other single-threaded processes that are likely to become bottlenecks?
    /jan

I think in your case a managed (physical) standby will be the safer option, because whenever archive logs are generated they will be applied, so there won't be any archive log loss.

  • System Image Restore Fails "No disk that can be used for recovering the system disk can be found"

Greetings. Our critical server image backup fails on one server.
We have two 2008 R2 servers. Both do a nightly "Windows Server Backup" of Bare Metal Recovery, System State, C:, and the System Reserved partition, to another storage hard drive on the same machine as the source. Active Directory is on C:. The much larger D: data partition on each source hard drive is not included.
    Test recovery by disconnecting 500G System drive, booting from 2008R2 Install DVD, recovering to a new 500G SATA hard drive.
    Server A good.
Server B fails. It finds the backed-up image, and then we can select the date we want. As soon as the image restore begins and the timeline appears, it bombs with "The system image restore failed. No disk that can be used for recovering the system disk can be found." There is a wordy Details message, but none of it seems relevant (we are not using USB etc.).
At some point after this, in one (or two?) of the scenarios below (I forget exactly where), we also got:
"The system image restore failed. (0x80042403)"
    The destination drive is Not "Excluded".
Used diskpart clean to remove the volumes from the destination drive. The recovery still errored.
    Tried a second restore-to drive, same make/model Seagate ST3500418AS, fails.
    Tried the earliest dated B image rather than the most recent, fail.
    The Server B backups show as "Success" each night.
Copied the image from B to the same storage drive on A where the A backup image is kept, and used the A hardware to attempt the restore. Now only the latest backup date is available (as would occur normally if we had originally saved the backup to a network location).
The restore still fails. It looks like it is to do with the image rather than with the hardware.
    Tried unticking "automatically check and update disk error info", still fail.
    Server A  SRP 100MB  C: 50.6GB on Seagate ST3500418AS 465.76GB  Microsoft driver 6.1.7600.16385   write cache off
    Server B  SRP 100MB  C: 102GB  on Seagate ST3500418AS 465.76GB  Microsoft driver 6.1.7600.16385   write cache off
    Restore-to hard drive is also Seagate ST3500418AS.
    http://social.answers.microsoft.com/Forums/en-US/w7repair/thread/e855ee43-186d-4200-a032-23d214d3d524      Some people report success after diskpart clean, but not us.
    http://social.technet.microsoft.com/Forums/en-US/windowsbackup/thread/31595afd-396f-4084-b5fc-f80b6f40dbeb
    "If your destination disk has a lower capacity than the source disk, you need to go into the disk manager and shrink each partition on the source disk before restoring."  Doesnt apply here.
    http://benchmarkreviews.com/index.php?option=com_content&task=view&id=439&Itemid=38&limit=1&limitstart=4
for 0x80042403 says "The solution is really quite simple: the destination drive is of a lower capacity than the image's source drive." I can't see that here.
    Thank you so much.

    Hello,
    1. While recovering the OS to the new Hard disk, please don't keep the original boot disk attached to the System. There is a Disk signature for each hard disk. The signature will collide if the original boot disk signature is assigned to the new disk.
    You may attach the older disk after recovering the OS. If you want to recover data to the older disk then they should be attached as they were during backup.
    2. Make sure that the new boot disk is attached as the First Boot disk in hardware (IDE/SATA port 0/master) and is the first disk in boot order priority.
3. In the Windows Recovery Environment (WinRE), check the boot disk using: Cmd prompt -> Diskpart.exe -> Select Disk = System. This will show the disk where the OS restore will be attempted. If the disk is different from the intended 2 TB disk, swap the disks into the correct order in the hardware.
4. Please make sure that the OS is always recovered to the System disk. (Due to an issue, BMR might recover the OS to some other disk if the System disk is small in size; in that case the OS won't boot. If you believe this is the case, then you should attach the bigger disk as the System disk and/or exclude other disks from the recovery.) Disk exclusion is provided in the System Image Restore/Complete PC Restore UI/command line.
5. Make sure that the number and size of the disks during the restore match the backup configuration. Apart from boot volumes, some other volumes are also considered critical if there are services/roles installed on them. These disks will be marked critical for recovery and must be present with the minimum size requirement.
    6. Some other requirements are discussed in following newsgroup threads:
    http://social.technet.microsoft.com/Forums/en-US/windowsbackup/thread/871a0216-fbaf-4a0c-83aa-1e02ae90dbe4
    http://social.technet.microsoft.com/Forums/en-US/windowsbackup/thread/9a082b90-bd7c-46f8-9eb3-9581f9d5efdd
    http://social.technet.microsoft.com/Forums/en-US/windowsbackup/thread/11d8c552-a841-49ac-ab2e-445e6f95e704
    Regards,
    Vikas Ranjan [MSFT]
    ------- This information is provided as-is without any warranties, implicit or explicit.-------

  • Windows 7 as ONLY operating system

    I would like to install Windows 7 in an Intel Mac mini as the PRIMARY/ONLY operating system (without Mac OS or boot-camp). I love my Macintosh and Macintosh hardware in general, but I need to provide a small-sized Windows 7 system for a very non-technical person, and would like to use Macintosh hardware if possible. Has anyone tried this, and if so, what happened? Thanks...

    Hi all,
Nicholas: the hatter has already posted some answers to the same question you asked in the Windows Compatibility Forum.
    Please don't cross-post in this Forum, it is considered to be not very polite.
While it is possible to have/make a Windows-only Mac (with or without the help of Boot Camp), I personally would suggest having at least some small OSX partition on it as well.
Something around 15-20GB is sufficient for OSX Snow Leopard if you leave out any additional software installations.
Apple sometimes releases firmware and SMC updates for Macs that only install if run from an internal OSX partition and won't run if OSX is on an external hard disk.
    And they definitely won't run in Windows.
    Regards
    Stefan

  • GRC system validation steps

Hi, please provide the GRC system validation steps in sequence. I am looking for the ordered steps we need to follow when validating a GRC system after applying support packs or an upgrade, etc. Appreciate your feedback. Thanks & Regards, Koteswara Rao.

    Hello Gurus,
    I too am looking for similar information so I am posting my question in the same thread.
I was going through the GRC 5.3 sizing guide to figure out the hardware requirements for RAR. We wanted to go with the expert sizing of RAR. Expert sizing takes into account factors such as the number of users, roles and violations. I thought we could easily figure out the number of roles and the number of users, but how do I estimate the "Number of SOD Violations"? Please advise.
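For a first rough guess, something like the purely hypothetical sketch below might do; the conflict rate is a made-up placeholder, and in practice you would run a sample risk analysis to get a real figure:

```python
# Purely hypothetical sketch for estimating the "Number of SOD Violations"
# input of the RAR expert sizing. conflict_rate is a made-up placeholder;
# run a sample risk analysis on a subset of users to get a real percentage.

users = 5000                 # assumed number of users in scope
avg_roles_per_user = 10      # assumed average role assignments per user
conflict_rate = 0.05         # assumed fraction of assignments that raise a violation

estimated_violations = int(users * avg_roles_per_user * conflict_rate)
print(f"Estimated SOD violations for sizing input: {estimated_violations}")
```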
    @ Sanjay,
I will just outline what we did so far; hope this might help you.
Here at our company, we started off by:
1) Buying the GRC Access Control suite
2) Deciding which components within GRC will be implemented first
3) Defining the systems (ECC, BI, etc.) in scope
4) Defining the system landscape (Dev and Test will be on one box, Prod will be on another box)
5) Sizing based on the systems in scope
6) Deciding whether to install on the existing SolMan or on a new box (because all that GRC 5.3 needs is an NW04s Java stack)
7) Buying hardware based on the sizing and installing NW
    and so on...
    Thanks in advance for your input...
    regards,
    Venkateswara Rao

  • Lot-sizing procedures

    Hi Experts,
Can anybody please explain the difference between the following lot-sizing procedures?
    static lot-sizing procedures
    period lot-sizing procedures
    optimum lot-sizing procedures
It would be very helpful if you could explain with an example. I will appreciate it with deserving points.

    Dear Raja,
Please find details about the lot-sizing procedures below:
    Static Lot-Sizing Procedures
    Use
    In static lot-sizing procedures, the procurement quantity is calculated exclusively by means of the quantity specifications entered in the material master.
    Features
    The following static lot-sizing procedures are available:
    Lot-for-lot order quantity
    Fixed lot size
    Fixed lot size with splitting and overlapping
    Replenishment up to maximum stock level
    Period Lot-Sizing Procedures
    Use
    In period lot-sizing procedures, the system groups several requirements within a time interval together to form a lot.
    Features
    You can define the following periods:
    days
    weeks
    months
    periods of flexible length equal to posting periods
    freely definable periods according to a planning calendar
    The system can interpret the period start of the planning calendar as the availability date or as the delivery date.
Splitting and overlapping are also possible for all period lot-sizing procedures.
    The system sets the availability date for period lot-sizing procedures to the first requirements date of the period. However, you can also define that the availability date is at the beginning or end of the period.
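As a small, hedged illustration of the grouping idea (the dates and quantities below are made up; weekly periods are just one of the options listed above):

```python
# Hedged sketch: period lot-sizing with weekly periods. Requirement dates and
# quantities are made-up values. The availability date is taken as the first
# requirement date of each period, as described above.
from datetime import date

requirements = [
    (date(2007, 9, 24), 40),
    (date(2007, 9, 26), 10),
    (date(2007, 10, 1), 25),
    (date(2007, 10, 4), 15),
]

lots = {}
for req_date, qty in sorted(requirements):
    key = req_date.isocalendar()[:2]   # (year, week): one lot per calendar week
    lot = lots.setdefault(key, {"qty": 0, "availability": req_date})
    lot["qty"] += qty

for (year, week), lot in sorted(lots.items()):
    print(f"Week {week}/{year}: lot of {lot['qty']} available on {lot['availability']}")
```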
    Optimum Lot-Sizing Procedures
    Use
In static and period lot-sizing procedures, the costs resulting from stockkeeping, setup procedures or purchasing are not taken into consideration. The aim of optimum lot-sizing procedures, on the other hand, is to group shortages together in such a way that costs are minimized. These costs include lot-size-independent costs (setup or order costs) and storage costs.
Taking purchasing as an example, the following trade-off arises:
If you order often, you will have low storage costs but high order costs due to the large number of orders. If you place orders only seldom, your order costs remain very low, but your storage costs will be very high, since warehouse stock must be large enough to cover requirements for a much longer period.
    Features
    The starting point for lot sizing is the first material shortage date that is determined during the net requirements calculation. The shortage quantity determined here represents the minimum order quantity. The system then adds successive shortage quantities to this lot size until, by means of the particular cost criterion, optimum costs have been established.
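As a rough, simplified sketch of that mechanism using a unit-cost criterion (the demand series and cost figures are made-up illustration values, not SAP defaults):

```python
# Hedged sketch: a simplified "least unit cost" style optimum lot-sizing
# heuristic. Demands and costs are made-up illustration values.

def least_unit_cost(demands, setup_cost, holding_cost_per_unit_period):
    """Group period demands into lots until the cost per unit stops improving."""
    lots = []
    i = 0
    while i < len(demands):
        best_k, best_unit_cost = i, float("inf")
        qty, holding = 0.0, 0.0
        for k in range(i, len(demands)):
            qty += demands[k]
            holding += demands[k] * (k - i) * holding_cost_per_unit_period
            if qty == 0:
                continue
            unit_cost = (setup_cost + holding) / qty
            if unit_cost <= best_unit_cost:
                best_k, best_unit_cost = k, unit_cost
            else:
                break  # unit cost started rising: stop extending this lot
        lots.append((i, best_k, sum(demands[i:best_k + 1])))
        i = best_k + 1
    return lots

if __name__ == "__main__":
    demands = [20, 50, 10, 80, 0, 30]   # requirements per period
    for start, end, qty in least_unit_cost(demands, setup_cost=100.0,
                                           holding_cost_per_unit_period=1.0):
        print(f"Lot covering periods {start}..{end}: order {qty} units")
```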
    The only differences between the various optimum lot-sizing procedures are the cost criteria. The following procedures are available:
    Part Period Balancing
    Least Unit Cost Procedure
    Dynamic Lot Size Creation
    Groff Reorder Procedure
    Hope this helps.
    Regards,
    Tejas

  • Windows system questions

    I've got an old Windows XP system that just doesn't cut it for Lightroom and I need to upgrade. I've been researching various systems and I'm trying to determine what the best system would be for Lightroom.
    I've got a budget of about $3k to $4k for a new CPU. I think that should allow me to get a pretty good system.
Here are my questions:
    1) Should I look at one of the "gaming" systems like the XPS or Blackbird, or are they not well suited to Photoshop and Lightroom?
    2) Should I consider running the 64 bit version of Vista? Does LR & PS CS3 support 64 bit or will they support it soon?
    3) I could get a Dell Precision workstation with dual Xeon processors for about the same cost, would it be better or worse than the XPS style systems?
    For the moment I'm only addressing Lightroom, Photoshop and DW CS3. I realize that the 64 bit OS would probably not support some of my other software, but I'm not a gamer, so I don't really care if the latest "shoot-em-up" doesn't run.
    I also realize that spending about $4K for a gaming rig will only get me a moderately fast system, but I don't have the $10K for a FalconNW tricked out box with liquid cooling and custom paint job. What's my best option that doesn't require a second mortgage?

    Hi Bob,
Yeah, if you are looking at the latest/greatest already, then it would be hard to beat the specs. But you could still win handily on the overall price.
One of the things I am considering for my next system (within 12 months) is a new HD scheme based on SSDs. Here is my initial thought on that topic so far:
    A separate ssd for the following:
    System Disk (32-64gb)
    LR DB disk (16-64 gb)
    pagefile disk (8-32gb)
I would continue to use old-fashioned (external) HDs for data storage/backup.
My main reason for this is fourfold:
    1) Heat generation
    2) power consumption
    3) disk access speed
4) size of disks vs. overall size of case (I like small cases)
My main imaging computer is only used for imaging apps, so I do not need a huge system disk (I use a laptop for everything else). Prices are falling quickly on SSDs, and the industry expects prices to be nearly half of today's prices within a year (or so).
As for RAM, do your homework. As of recently (this may change with Intel's newest offerings), Intel systems prefer higher bandwidth, while AMD systems like faster latency timings. Just getting higher-bandwidth RAM may not do as much as you hope, so make sure you do your research for the complete system you want to build. I am no expert here; I just do LOTS of homework before I fork over cash for my parts.
As for the processor, you mentioned not wanting to overclock your system. Well, don't overlook this possibility. At least with AMD processors (again, this is recent) you can go with a lower-priced (slower) processor and overclock it mildly to meet the speed of the more costly, speedier model. And this holds true for their higher-end chips as well: as much as you can push a slower one, you can push a natively faster one. All this can usually be done with nothing more than a good heat sink/fan. I have been running with a 15% overclock for the past year and have had no problems (and I am only using the stock heat sink/fan in a very small case; with better cooling and a bigger case I am sure I could safely do 25%+).
    Just some thoughts...
Oh, one other thing: power supplies. Make sure you get one that is properly sized for your system. It will run more efficiently and save you bucks every month! There are websites that give you details on how to calculate the proper power supply size. Just don't get sucked into the bigger-is-better idea with them.
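The calculation those sites do boils down to something like the sketch below; every wattage figure here is a made-up placeholder, so use the manufacturer's specs (or one of those calculators) for real numbers:

```python
# Hedged sketch: rough PSU sizing by summing estimated component draw and
# adding headroom. All wattages are hypothetical placeholders.

component_watts = {
    "cpu": 95,             # desktop CPU under load (placeholder)
    "gpu": 150,            # graphics card under load (placeholder)
    "motherboard": 40,
    "ram_per_module": 3,
    "drive_each": 8,
    "fans_and_misc": 20,
}
ram_modules, drives = 4, 3

load_watts = (component_watts["cpu"] + component_watts["gpu"]
              + component_watts["motherboard"]
              + ram_modules * component_watts["ram_per_module"]
              + drives * component_watts["drive_each"]
              + component_watts["fans_and_misc"])

headroom = 1.3  # ~30% margin so the PSU runs in its efficient band
print(f"Estimated load: {load_watts} W, suggested PSU: {round(load_watts * headroom)} W")
```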

  • Something's still not right

I just got this system set up over the weekend and everything works fine, BUT... I have a few questions about my BIOS (AMIBIOS new setup utility, ver 3.31a). In Standard CMOS Features, I have no primary IDE master or slave. My 2 CD drives are the secondary master and slave, and my S-ATA hard drive is the third IDE master. Shouldn't this be the primary, or is it OK here? Next, in Advanced BIOS Features, my floppy is not listed in boot device select. Is something else set up wrong elsewhere? And my last question: what should my AGP aperture be set at? For now it is at 64MB. What does it do? And should it be higher if my graphics card has 256MB? Thanks in advance.
    Doug S.

    How big should I set AGP Aperture size in my BIOS?
    First of all, AGP Aperture memory will not be used until your video card's on-board memory is running low. That means it will usually not impact your gaming performance because developers are trying hard to not exceed the on-board memory limits.
    The bigger your video memory, the smaller your Aperture Size could be. However with later games requiring more and more texture memory a good number seems to be 128MB Aperture Size for all cards with 64 MB to 256 MB Video RAM.
    Setting the Aperture Size to HUGE values will not increase performance because this merely sets the maximum amount of physical memory that can be used. It only makes the GART Table bigger because every 4K page has its own entry, no matter if allocated or not.
    Setting the Aperture Size to too small values could result in running out of available texture memory especially on a low-mem video card. It is also possible that developers make use of the GART's features by creating textures as 'non-local'.
    If you experience in-game stuttering try playing with the size of your Aperture.
    What is it from a technical point of view?
    When using an AGP card the video memory on the graphics adapter is mapped into the 4 GB memory address space (above the region of the physical installed memory). Any accesses to this memory region are directly forwarded to the video memory, greatly increasing transfer rates. However in earlier days of video cards graphics memory was rather limited and ran out quickly (a single 32-bit 512x512 MIP-mapped texture consumes ~1.5 MB) so AGP added a mechanism to use the system's main memory as additional storage for graphics data such as textures. This is what the AGP Aperture is. Usually directly below the mapped video memory the system reserves a contiguous space of addresses the size of your Aperture (no physical memory will be consumed at this time).
When free video RAM is running low, the system dynamically allocates 4K-sized pages of system memory for use as AGP Aperture memory. The problem with this dynamic allocation is that in many cases the pages are spread in a non-contiguous way throughout physical memory. Accessing these pages directly would hinder performance because the scattering/gathering would require extra logic. To get around this limitation, the GART (Graphics Address Remapping Table), which is implemented in hardware in the Northbridge's Memory Controller Hub, provides an automatic physical-to-physical mapping between the scattered pages and the AGP Aperture.
The actual usable amount of this 'virtual' AGP memory is less than half the AGP Aperture size set in the BIOS. This is because the Aperture is divided into two areas: one uncached half and another write-combined area.
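Putting rough numbers on the two points above (one GART entry per 4K page, and usable memory being less than half the aperture); the 4-byte entry size is an assumption for illustration only:

```python
# Hedged sketch: back-of-the-envelope numbers for an AGP aperture setting.
# One GART entry per 4 KB page is from the text above; the 4-byte entry size
# is an assumption for illustration, as is the "half the aperture" bound.

def aperture_numbers(aperture_mb, entry_bytes=4):
    pages = (aperture_mb * 1024 * 1024) // 4096    # one GART entry per 4 KB page
    gart_table_kb = pages * entry_bytes / 1024
    usable_mb = aperture_mb / 2                    # rough upper bound per the text
    return pages, gart_table_kb, usable_mb

for size in (64, 128, 256):
    pages, table_kb, usable = aperture_numbers(size)
    print(f"{size} MB aperture: {pages} GART entries (~{table_kb:.0f} KB table), "
          f"at most ~{usable:.0f} MB usable for textures")
```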

  • Installation of BOBI.

    Hi,
    Our Company has decided to implement BOBI.
Currently we are on ERP6 EHP4 with the ABAP stack only.
BOBI will require the Java stack, but in EHP4 an add-in installation is not possible.
So should we install a new system with the ABAP + Java stack, upgrade it to EHP4, and then install BOBI?
I am confused about whether we will have to do the configuration again for the system (all data, Z reports, interfaces).
Please advise how we can install BOBI on our current landscape.
    Regards
    Abhishek.

    Hi,
> While installing the JAVA system, will it be a separate ERP system with its own DB?
No, it is not an ERP system; it is a J2EE engine only and will be accessible using a web browser, not the SAP GUI.
> Does it have to be connected using JCo?
If you want to set up connectivity between the Java engine and the existing ERP system, then yes, JCo can be used.
> Will the sizing of this system (Java) be similar to the ABAP-stack ERP system?
No, it is different; for sizing of the Java engine you can refer to http://service.sap.com/sizing
As an idea, the DB size will be between 10-20 GB only.
    Regards,

  • Seeburger As2 Receiver channel exception for large files

    Hello Folks,
We have a JMS to Seeburger AS2 interface, and we are facing the following issue in the AS2 receiver channel for files larger than 20 MB. In production it is working fine even for 40 MB files.
Delivering the message to the application using connection AS2_http://seeburger.com/xi failed, due to:
com.sap.engine.interfaces.messaging.api.exception.MessagingException: javax.resource.ResourceException: Fatal exception: javax.resource.ResourceException: SEEBURGER AS2: org.apache.commons.httpclient.ProtocolException: Unbuffered entity enclosing request can not be repeated. #, SEEBURGER AS2: org.apache.commons.httpclient.ProtocolException: Unbuffered entity enclosing request can not be repeated. #.
Please throw some light on the issue.
    Regards
    Praveen Reddy

    Hi Praveen,
The problem is likely related to server sizing. Generally, a test system does not have the same sizing as the production server, and because of that you cannot process large files in the test system.
Check the sizing of the system.
    regards,
    Harish

  • Quicksizer for ADS 7.02 and SLD 7.02

    Dear all,
We have installed two systems, an ADS and an SLD, both on an SAP NetWeaver 7.02 platform.
Now we have to install both of them in the QAS and PRD environments.
Question:
Using the Quick Sizer tool at http://service.sap.com/quicksizer, I can see that there is no specific option to analyze
the ADS and SLD hardware requirements.
    Any suggestion is welcome
    Best regards
    SAP NetWeaver Admin

    Pascal Cuennet wrote:
    Dear Sunny,
    > Dear Siva,
    >
    > Portal is made of an AS-JAVA + EP + EP Core
    > ADS and SLD are only composed of an AS-JAVA.
    >
    > I cannot size ADS or SLD as an EP.
    >
    > Best regards
    > SAP NetWeaverBC
You are right. Then why do you want to do sizing for ADS and SLD? Could you please explain the reason for doing sizing of ADS and SLD? I would do sizing of the portal system based on usage scenarios like BI, ESS/MSS etc., not for ADS and SLD.
    Thanks
    Sunny

  • COPA vs. BCS design decisions (ex. profitability by customer in BCS)

    We are trying to meet a business goal of identifying gross profit by customer.
    We realize "customer" as a field in BCS is problematic, so we are thinking of only storing certain customers in BCS with a catch-all "Others" customer - with the goal of keeping the BCS data volume reasonable.
    Consider the scenario: US company sells material X qty 1 to Spain company for 100 with cost of 30 (therefore profit of 70)
    Spain sells same material X qty 1 to third-party customer for 120.
    Spain from a local perspective profits 20, however the group from an overall perspective profits 90 ( the US revenues of 100 eliminate against the Spain COGS of 100 so you are left with revenue of 120, COGS 30, profit 90 - from the group perspective ).
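To spell out the arithmetic (purely an illustration of the numbers above, not how BCS performs eliminations):

```python
# Illustrative sketch of the group-level profit arithmetic from the scenario
# above; this is not BCS elimination logic, just the numbers spelled out.

us_sale_to_spain = {"revenue": 100, "cogs": 30}         # US books (intercompany sale)
spain_sale_to_customer = {"revenue": 120, "cogs": 100}  # Spain books (third-party sale)

us_profit = us_sale_to_spain["revenue"] - us_sale_to_spain["cogs"]                  # 70
spain_profit = spain_sale_to_customer["revenue"] - spain_sale_to_customer["cogs"]   # 20

# Group view: the US intercompany revenue of 100 eliminates against the Spain
# COGS of 100, leaving third-party revenue 120 and original cost 30.
group_profit = spain_sale_to_customer["revenue"] - us_sale_to_spain["cogs"]         # 90
print(us_profit, spain_profit, group_profit)  # 70 20 90
```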
    We want to know how to see, on a customer level, the 90 profit from these transactions. 
We do not believe COPA can do this; can it be accomplished in BCS?
    If you do a "one-sided" elimination (elimination driven by the revenue side only) of the intercompany revenue the system would not be able to reference customer on the elimination. We are wondering if this scenario of analyzing overall profit by customer can be accomplished by BCS functionality and are particularly interested in knowing what functionality you used to accomplish this requirement and in what sequence within the BCS close (BCS monitor).
    Thank you in advance for any input you may have.
    Also we are interested in any opinions/comments anyone may have about design decisions regarding BCS vs. COPA in BW.  BCS business content identifies a sample design for a BCS data model including item, company, movement type, trading partner, functional area, etc.  COPA (as configured in R3/ECC and extracted to BW) commonly features analysis by customer, material, etc.  Considering BCS features elimination functionality, what design concerns have people faced with respect to fields that they include in both reporting systems?  Obviously a prominent concern is sizing of the systems, but what common characteristics has anyone decided to feature in both systems? What considerations drove the decisions as to what common characteristics to feature in both BCS and COPA?

    Hi John,
    Reg, your last question - might be useful info in here, if you have not seen it yet:
    Re: Reports using COPA cube, BCS Cube
