Steps for loading data into the InfoCube in BI7, with DSO in between

Dear All,
I am loading data into the InfoCube in BI7, with a DSO in between. My data flow looks like this:
Top to bottom:
InfoCube (Customized)
Transformation
DSO (Customized)
Transformation
DataSource (Customized).
The mapping and everything else looks fine, and data is also seen in the cube after a FULL load.
But due to some minor error (I guess), I am unable to see the DELTA data in the DSO, although it is loaded into the DataSource through process chains.
Kindly advise me where I went wrong.
Or, step-by-step instructions for loading data into the InfoCube in BI7, with a DSO in between, would be really helpful.
Regards,

Hi,
my first impulse would be to check whether the DSO is set to "direct update". In that case no delta is possible, because the change log is not maintained.
My second thought would be to check the DTP moving the data between the DSO and the target cube. If this is set to full, you will not get a delta. It is only possible to create one DTP between them, and you cannot switch an existing FULL DTP to delta afterwards. Just create the DTP in delta mode.
Hope this helps.
Kind regards,
Jürgen

Similar Messages

  • Adding data into the InfoCube

    I have followed a step-by-step guide for this BI solution, but I cannot load the data into the InfoCube. Is there any way to load data into the InfoCube? Thanks for your help.

    Hi,
    Please follow the steps below to load your data:
    1. Open the Administrator Workbench: Modeling, from the menu or using transaction RSA1.
    2. Go to Source Systems (File) and create a new source system.
    3. Double-click on DataSources and create a DataSource.
    4. Entitle the DataSource; choose Transaction Data as the DataSource data type.
    5. On the next screen, go to Extraction and activate the DataSource.
    6. Go to Preview. Press Read Preview Data.
    7. The data will be loaded: save and activate the DataSource.
    8. Go to InfoProvider, go to an InfoArea and create an InfoCube.
    9. Entitle the InfoCube and press Create.
    10. Display all InfoObjects: choose your InfoObjects, move the characteristics and key figures to the InfoCube by drag and drop, and activate.
    11. Go to the InfoCube and create a transformation.
    12. Click on the 'Show Navigator' button. Match the fields in the navigator. Save and activate.
    13. Go to InfoProvider, your InfoCube, and create a Data Transfer Process.
    14. Choose Extraction Mode: Full; Update: Valid records update, Reporting possible (Request green). Activate and execute.
    15. Double-click on the DataSource and Read Preview Data.
    16. Create an InfoPackage by right-clicking on the DataSource. Execute the load into the PSA.
    17. Verify that data was loaded into the PSA before proceeding with the next step.
    18. Execute the Data Transfer Process.
    19. Monitor the results (menu: Goto > DTP Monitor).
    20. Try to display the data in the BW frontend.
    Hope this is helpful.
    Harish

  • How can I retrieve data into the InfoCube from archived files

    Hi,
    I have archived cube data, and I have to load data into the cube from the archived files.
    So now I need to find the archive files and load the data into the cube.
    Thanks

    Hi,
    Reloading archived data should be an exception rather than the general case, since data should be archived only if it is not needed in the database anymore. When the archived data target also serves as a datamart to populate other data targets, it is recommended that you load the data into a copy of the original (archived) data target and combine the two resulting data targets with a MultiProvider.
    In order to reload the data to a data target, you have to use the export DataSource of the archived data target. Therefore, you create an update rule based on the respective InfoSource (technical name 8<data target name>). You then trigger the upload either by using 'Update ODS data in data target' or by replicating the DataSources of the MYSELF source system and subsequently scheduling an InfoPackage for the respective InfoSource.
    If you want to read the data for reporting or control purposes, you have to write a report which reads the data from the archive files sequentially.
    Alternatively, you can use the Archiving Information System (AS). This tool enables you to define an InfoStructure and create reports based on these InfoStructures. The InfoStructures define an index for the archive file data. At the moment, the archiving process in the BW system does not fill the InfoStructures automatically during the archiving session; this has to be done manually when needed.
    Another way of displaying data from the archive file is using the 'Extractor checker' (transaction RSA3). Enter the name of the export DataSource of the respective data target (the name of the data target preceded by '8'), and choose the archive files that are to be read. The extractor checker reads the selected archive files sequentially. Selection conditions can be entered for filtering, but they have to be entered in internal format.
    The data will remain the same in the change log table.
    Check this link:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b32837f2-0c01-0010-68a3-c45f8443f01d
    Hope this helps.
    Regards,
    Debjani

  • FDMEE Import error "No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'

    Hi,
    We are having trouble importing one ledger, 'GERMANY EUR GGAAP'. It works for Dec 2014, but when trying to import data for 2015 it gives an error.
    The import error shows: "RuntimeError: No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'."
    I tried all the knowledge documents from Oracle Support but no luck. Please help us resolve this issue, as it is occurring in our production system.
    I also checked all the period settings under Data Management > Setup > Integration Setup > Global Mapping and Source Mapping, and they all look correct.
    Also, it is only happening for this one ledger; all other ledgers are working fine without any issues.
    Thanks

    Hi,
    there are some support documents related to this issue.
    I would suggest you have a look at them.
    Regards

  • Error while loading data into Customers InfoCube

    Hi All,
    I am receiving the following error while loading data into InfoCube 0SD_C01.
    The error I am receiving is "LIS Updating is Still Active".
    I am also receiving an error similar to "The data is being edited at the time of loading the data request".
    Has anybody encountered such a situation?
    Please help me out.
    Regards
    YJ

    Hi Dinesh and Paolo,
    The links are very useful.
    I went through the transactions and found out that asynchronous updating is going on for S001.
    So what steps can I take as a BW consultant so that the data is loaded into BW?
    Regards
    YJ

  • Cannot load data into the ODS

    When I was trying to load data into BW 3.5 from the R3 system, I got this message:
    "For InfoSource ZSD_A507_011, there is no Hierarchies for master data in source system R3PCLNT300"
    Any idea? There are no hierarchies for this.

    Hello Mat,
    What are you trying to load?
    Is the DataSource a hierarchy DataSource?
    Just check this thread:
    There are no hierarchies for this InfoSource in source system
    Thanks
    Ajeet

  • Procedure for loading data into gl_interface

    Hi all,
    I am new to Oracle Applications. I just want to know how to load data into the GL_INTERFACE table using PL/SQL (that is, first loading the data into a temporary table and then using a PL/SQL procedure to load it into GL_INTERFACE). Can anybody help me out with this by providing the PL/SQL structure for it?
    Thanks in advance

    Assuming you have the data in a file and the file is comma-delimited. I assume the table has two columns; you can add more columns as well.
    CREATE OR REPLACE PROCEDURE p10
    IS
       lv_filehandle   UTL_FILE.FILE_TYPE;
       lv_newline      VARCHAR2(2000);                -- one input line
       lv_file_dir     VARCHAR2(100);
       lv_file_name    VARCHAR2(100);
       lv_col1         VARCHAR2(10);
       lv_col2         VARCHAR2(50);
    BEGIN
       DELETE FROM temp_table;
       lv_file_dir   := 'p:\temp';                    -- must be readable by UTL_FILE
       lv_file_name  := 'test.dat';
       lv_filehandle := UTL_FILE.FOPEN (lv_file_dir, lv_file_name, 'r', 32766);
       LOOP
          BEGIN
             UTL_FILE.GET_LINE (lv_filehandle, lv_newline);  -- read the next line
          EXCEPTION
             WHEN NO_DATA_FOUND THEN
                EXIT;                                        -- end of file
          END;
          -- split the comma-delimited line into its two columns
          lv_col1 := SUBSTR (lv_newline, 1, INSTR (lv_newline, ',') - 1);
          lv_col2 := SUBSTR (lv_newline, INSTR (lv_newline, ',') + 1);
          INSERT INTO temp_table VALUES (lv_col1, lv_col2);
       END LOOP;
       UTL_FILE.FCLOSE (lv_filehandle);
       -- now move the staged rows into the production table
       INSERT INTO your_production_table
          SELECT * FROM temp_table;
       COMMIT;
    EXCEPTION
       WHEN UTL_FILE.INVALID_PATH THEN
          RAISE_APPLICATION_ERROR (-20100, 'Invalid Path');
       WHEN UTL_FILE.INVALID_MODE THEN
          RAISE_APPLICATION_ERROR (-20101, 'Invalid Mode');
       WHEN UTL_FILE.INVALID_OPERATION THEN
          RAISE_APPLICATION_ERROR (-20102, 'Invalid Operation');
       WHEN UTL_FILE.INVALID_FILEHANDLE THEN
          RAISE_APPLICATION_ERROR (-20103, 'Invalid Filehandle');
       WHEN UTL_FILE.READ_ERROR THEN
          RAISE_APPLICATION_ERROR (-20105, 'Read Error');
       WHEN UTL_FILE.INTERNAL_ERROR THEN
          RAISE_APPLICATION_ERROR (-20106, 'Internal Error');
       WHEN OTHERS THEN
          IF UTL_FILE.IS_OPEN (lv_filehandle) THEN
             UTL_FILE.FCLOSE (lv_filehandle);
          END IF;
          RAISE;
    END p10;
    /
    The code is not tested.
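    Since the question was specifically about GL_INTERFACE, here is a minimal sketch of what the final insert might look like in place of the generic your_production_table. This is an untested illustration with placeholder values: the required columns of GL_INTERFACE vary by EBS release and chart of accounts, and the temp-table column names (col1, col2) as well as all the literals (set of books, source, category, segments, amounts) are assumptions you must replace with your own setup. Rows staged with status 'NEW' are then picked up by the Journal Import concurrent program.
    INSERT INTO gl_interface
           (status, set_of_books_id, accounting_date, currency_code,
            date_created, created_by, actual_flag,
            user_je_source_name, user_je_category_name,
            segment1, segment2, entered_dr, entered_cr)
    SELECT 'NEW', 1, SYSDATE, 'USD',         -- placeholder ledger/currency
           SYSDATE, 0, 'A',
           'Manual', 'Adjustment',           -- placeholder source/category
           t.col1, t.col2, 100, NULL         -- placeholder segments/amounts
      FROM temp_table t;
    COMMIT;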
    Hope this helps
    Ghulam

  • How to load data into an InfoCube from a flat file

    Hi friends,
    I am a learner of BW.
    I have master data and transaction data to load into an InfoCube.
    I think we can load the master data first and then the transaction data; that is, we have to load the master data through direct or flexible update, and then the transaction data through flexible update, by preparing two Excel sheets (one for master data and the other for transaction data).
    Or can we load both at a time by preparing a single Excel sheet which covers the values of characteristics, attributes and key figures?
    Sai

    Hi Sai,
    Please find my replies below.
    "I have master and transaction data to load into an info cube."
    >> You load master data into InfoObjects and transaction data into cubes. So first identify the master data InfoObjects and the InfoCubes.
    "I think we can load master data first, and then transaction data."
    >> Yup, good practice is to load master data first and then transaction data.
    "that is master data we have to load through direct or flexible update"
    >> You need to look into your scenario: which one are you using?
    "and then transaction data through flexible update by preparing 2 excel sheets (one for master data and other for transaction data)"
    >> Yup, I think you will need separate files: a master data one for the InfoObject data load and a transaction data one for the InfoCube data load.
    "Or can we load both at a time by preparing a single excel sheet which covers the values of characteristics, attributes and key figures?"
    Hope it helps
    Regards
    Vikash
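    To make the two-file idea concrete, here is a hypothetical pair of sheets saved as CSV (all InfoObject names and values are made up for illustration; yours will differ):
    Master data file (loads the customer InfoObject's attributes):
    CUSTOMER,NAME,CITY
    C001,Acme Ltd,Berlin
    C002,Globex,Madrid
    Transaction data file (loads the InfoCube: characteristics plus key figures):
    CUSTOMER,CALDAY,SALES_AMOUNT
    C001,20080101,1500
    C002,20080102,890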

  • Best practice for loading data into BW: CSV vs XML?

    Hi Everyone,
    I would like to get some of your thoughts on which file format would be the best or most efficient for pushing data into BW: CSV or XML?
    Also, what are the advantages/disadvantages?
    I appreciate your thoughts.

    XML is used only for small data volumes; it is more that it is easier to do with XML than to build an application for the same, provided the usage is low.
    Flat files are used for HUGE (non-SAP) data loads, and for those the choice of data format would definitely be flat files.
    Also, XML files are transformed into a flat-file-like format, with each tag referring to a field, so the size of an XML file grows large depending on the number of fields.
    Arun
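    To illustrate the size difference, here is the same hypothetical record in both formats (the field names are made up):
    CSV (one line per record):
    0000012345,1000,25.00
    XML (every field wrapped in named tags, so the same record is several times larger):
    <record>
      <material>0000012345</material>
      <plant>1000</plant>
      <quantity>25.00</quantity>
    </record>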

  • Error while loading data into the cube

    Hi,
    I loaded data into the PSA, and when I load the data to the cube through a Data Transfer Process, I get an error (red).
    Through "Manage", I could see the request in red. How do I get to know the exact error? Also, what could be the possible reason for this?
    Also, can someone explain the Data Transfer Process (not in a process chain)?
    Regards,
    Sam

    Hi Sam,
    After you load the data through the DTP (after clicking on the Execute button), just go to the monitor screen and press the Refresh button; you can find the logs there.
    Otherwise, in the request screen, beside the request number, you can see the log icon; you can click on that.
    What a DTP is:
    A DTP is used for the data transfer process from the PSA to a data target, i.e. to load data into data targets or InfoProviders like DSOs, cubes, and so on.
    Check this link for DTP:
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    In your case, the problem may be with the date formats or special characters; check those.
    Regards,
    @JAY

  • Interface for loading data into customer products

    I am trying to find out if there is an interface for loading customer products into Install Base, apart from entering them manually. I understand that this exists in Oracle Apps version 11.5.7, but I need it for version 11.5.4.

    Hi,
    In 11.5.4 you have to write the loader yourself, using the Oracle standard API.
    I have done it; it works fine.
    Hugues

  • After loading the data into the InfoCube, it should not display in reporting

    Hi experts,
                     I have data in an InfoCube. When I select the cube in reporting, it should not display, i.e. the data should not appear in reporting.
    thanks nd regards,
    venkat

    A bit of a strange requirement. If you have data in the cube, it will automatically be available for reporting; unlike an ODS, we do not have the option of "automatic request activation".
    However, on the reporting side, you can restrict the values to # and exclude everything from the selection. But that is not a good solution as such.
    Regards
    Pankaj

  • Short Dump while loading data into the GL ODS

    Hello guys,
    I am getting a database error while loading the GL ODS in BW. I am performing an INIT load, and the error occurs during the activation stage:
    Database Error

    Hello Atul,
    PSAPTEMP has nothing to do with the PSA. It is a temporary storage space and is present in all systems.
    You can check this in transaction DB02 if you are using BW 3.x, or DB02OLD if you are using BI 7.0. Once in DB02/DB02OLD, you will see a block that says Tablespaces. In this block there is a pushbutton for freespace statistics; click on it and it will take you to a screen that shows the free space for all tablespaces.
    Hope this helps,
    Regards.

  • Exception while loading data into the cache

    I'm getting the following error while attempting to pre-populate the cache:
    2010-11-01 16:27:21,766 ERROR [STDERR] (Logger@9229983 n/a) 2010-11-01 16:27:21.766/632.975 Oracle Coherence EE n/a <Error> (thread=DistributedCache, member=1): SynchronousListener cannot be added on the service thread:
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$BinaryMap.addMapListener(PartitionedCache.CDB:14)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.addMapListener(PartitionedCache.CDB:1)
         at com.tangosol.coherence.component.util.SafeNamedCache.addMapListener(SafeNamedCache.CDB:27)
         at com.tangosol.net.cache.CachingMap.registerListener(CachingMap.java:1463)
         at com.tangosol.net.cache.CachingMap.ensureInvalidationStrategy(CachingMap.java:1579)
         at com.tangosol.net.cache.CachingMap.registerListener(CachingMap.java:1484)
         at com.tangosol.net.cache.CachingMap.get(CachingMap.java:487)
         at com.jpm.ibt.primegps.cachestore.AbstractGpsCacheStore.isEnabled(AbstractGpsCacheStore.java:54)
         at com.jpm.ibt.primegps.cachestore.AbstractGpsCacheStore.store(AbstractGpsCacheStore.java:83)
         at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.store(ReadWriteBackingMap.java:4783)
         at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.storeInternal(ReadWriteBackingMap.java:4468)
         at com.tangosol.net.cache.ReadWriteBackingMap.putInternal(ReadWriteBackingMap.java:1147)
         at com.tangosol.net.cache.ReadWriteBackingMap.put(ReadWriteBackingMap.java:853)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.put(PartitionedCache.CDB:98)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onPutAllRequest(PartitionedCache.CDB:41)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PutAllRequest.onReceived(PartitionedCache.CDB:90)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:11)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:33)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
         at java.lang.Thread.run(Thread.java:619)
    2010-11-01 16:27:21,829 ERROR [STDERR] (pool-14-thread-3) Exception in thread "pool-14-thread-3"
    2010-11-01 16:27:21,829 ERROR [STDERR] (Logger@9229983 n/a) 2010-11-01 16:27:21.766/632.975 Oracle Coherence EE n/a <Error> (thread=DistributedCache, member=1): Assertion failed: poll() is a blocking call and cannot be called on the Service thread
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.poll(Grid.CDB:5)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.poll(Grid.CDB:11)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$BinaryMap.get(PartitionedCache.CDB:26)
         at com.tangosol.util.ConverterCollections$ConverterMap.get(ConverterCollections.java:1559)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.get(PartitionedCache.CDB:1)
         at com.tangosol.coherence.component.util.SafeNamedCache.get(SafeNamedCache.CDB:1)
         at com.tangosol.net.cache.CachingMap.get(CachingMap.java:491)
         at com.jpm.ibt.primegps.cachestore.AbstractGpsCacheStore.isEnabled(AbstractGpsCacheStore.java:54)
         at com.jpm.ibt.primegps.cachestore.AbstractGpsCacheStore.store(AbstractGpsCacheStore.java:83)
         at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.store(ReadWriteBackingMap.java:4783)
         at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.storeInternal(ReadWriteBackingMap.java:4468)
         at com.tangosol.net.cache.ReadWriteBackingMap.putInternal(ReadWriteBackingMap.java:1147)
         at com.tangosol.net.cache.ReadWriteBackingMap.put(ReadWriteBackingMap.java:853)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.put(PartitionedCache.CDB:98)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onPutAllRequest(PartitionedCache.CDB:41)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PutAllRequest.onReceived(PartitionedCache.CDB:90)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:11)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:33)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
         at java.lang.Thread.run(Thread.java:619)
    2010-11-01 16:27:21,922 ERROR [STDERR] (pool-14-thread-3) (Wrapped: Failed request execution for DistributedCache service on Member(Id=1, Timestamp=2010-11-01 16:20:09.372, Address=169.65.134.90:7850, MachineId=28250, Location=site:EMEA.AD.JPMORGANCHASE.COM,machine:WLDNTEC6WM754J,process:5416) (Wrapped: Failed to store key="9046019") poll() is a blocking call and cannot be called on the Service thread) com.tangosol.util.AssertionException: poll() is a blocking call and cannot be called on the Service thread
    I'm a bit stumped, as my code doesn't call poll() anywhere, and this appears to be caused by the following in my CacheStore class:
    public void store(Object key, Object value) {
        log.info("CacheStore currently " + isEnabled());
        if (isEnabled()) {
            throw new UnsupportedOperationException("Store method not currently supported");
        }
    }

    public boolean isEnabled() {
        return ((Boolean) CacheFactory.getCache(CacheNameEnum.CONTROL_CACHE.name()).get(ENABLED)).booleanValue();
    }
    The only thing I can think of is that maybe it has a problem calling a cache from within a CacheStore (if that makes sense). What I have is a CONTROL_CACHE which just stores a boolean value to indicate whether the store(), storeAll(), erase(), and eraseAll() methods should do anything. Is this correct?
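    One possible way around both errors, sketched below assuming the Coherence 3.x API: instead of having isEnabled() call CacheFactory.getCache(...).get(...) on the cache-service thread (which triggers the near cache's synchronous listener registration and the blocking poll()), keep the flag in a volatile field and refresh it with a MapListener registered once from an application thread. Everything except the CONTROL_CACHE name and the CacheStore method set is made up for illustration; this is an untested sketch, not the poster's or Oracle's fix.
    import java.util.Collection;
    import java.util.Map;

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.net.cache.CacheStore;
    import com.tangosol.util.AbstractMapListener;
    import com.tangosol.util.MapEvent;

    public class ControlledCacheStore implements CacheStore {

        private static final String ENABLED = "ENABLED"; // assumed key of the flag entry

        // Cached copy of the control flag; isEnabled() is now a plain field
        // read, so nothing is fetched from another cache on the service thread.
        private volatile boolean enabled;

        public ControlledCacheStore() {
            // The constructor runs on an application thread, where both the
            // initial get() and the listener registration are safe.
            NamedCache control = CacheFactory.getCache("CONTROL_CACHE");
            enabled = Boolean.TRUE.equals(control.get(ENABLED));
            control.addMapListener(new AbstractMapListener() {
                public void entryInserted(MapEvent evt) { refresh(evt); }
                public void entryUpdated(MapEvent evt)  { refresh(evt); }
            }, ENABLED, false);
        }

        private void refresh(MapEvent evt) {
            enabled = Boolean.TRUE.equals(evt.getNewValue());
        }

        public boolean isEnabled() {
            return enabled;
        }

        public void store(Object key, Object value) {
            if (isEnabled()) {
                throw new UnsupportedOperationException("Store method not currently supported");
            }
        }

        public void storeAll(Map entries)      { /* guard with isEnabled() as above */ }
        public void erase(Object key)          { }
        public void eraseAll(Collection keys)  { }
        public Object load(Object key)         { return null; }
        public Map loadAll(Collection keys)    { return null; }
    }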

    Hi Jonathan,
    I am trying to implement a write-behind cache but my configs may be wrong. The config for the cache with the cachestore looks like:
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd" >
    <cache-config>
         <caching-scheme-mapping>
         <cache-mapping>
              <cache-name>PARTY_CACHE</cache-name>
              <scheme-name>party_cache</scheme-name>
         </cache-mapping>
         </caching-scheme-mapping>
         <caching-schemes>
              <near-scheme>
                   <scheme-name>party_cache</scheme-name>
                   <service-name>partyCacheService</service-name>
                   <!-- a sensible default ? -->
                   <thread-count>5</thread-count>
                   <front-scheme>
                        <local-scheme>
                        </local-scheme>
                   </front-scheme>
                   <back-scheme>
                        <distributed-scheme>
                             <backing-map-scheme>
                                  <read-write-backing-map-scheme>
                                       <internal-cache-scheme>
                                            <local-scheme>
                                            </local-scheme>
                                       </internal-cache-scheme>
                                       <cachestore-scheme>
                                            <class-scheme>
                                                 <class-name>spring-bean:partyCacheStore</class-name>
                                            </class-scheme>
                                       </cachestore-scheme>
                                  </read-write-backing-map-scheme>
                             </backing-map-scheme>
                        </distributed-scheme>
                   </back-scheme>
                   <autostart>true</autostart>
              </near-scheme>
         </caching-schemes>
    </cache-config>

    and the control cache config looks like:
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd" >
    <cache-config>
         <caching-scheme-mapping>
         <cache-mapping>
              <cache-name>CONTROL_CACHE</cache-name>
              <scheme-name>control_cache</scheme-name>
         </cache-mapping>
         </caching-scheme-mapping>
         <caching-schemes>
              <near-scheme>
                   <scheme-name>control_cache</scheme-name>
                   <service-name>controlCacheService</service-name>
                   <!-- a sensible default ? -->
                   <thread-count>5</thread-count>
                   <front-scheme>
                        <local-scheme>
                             <high-units>100</high-units>
                        </local-scheme>
                   </front-scheme>
                   <back-scheme>
                        <distributed-scheme>
                             <backing-map-scheme>
                                  <read-write-backing-map-scheme>
                                  </read-write-backing-map-scheme>
                             </backing-map-scheme>
                        </distributed-scheme>
                   </back-scheme>
                   <autostart>true</autostart>
              </near-scheme>
         </caching-schemes>
    </cache-config>

    They have different service names, but I'm guessing this isn't what you mean, and I thought I was using write-behind, but again I'm guessing my config is not correct?
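    One thing worth checking on the write-behind side: neither read-write-backing-map-scheme above specifies a write delay, and a read-write backing map with a zero write delay operates in write-through mode, not write-behind. Write-behind needs a non-zero delay; a minimal sketch of the change (the 10s value is an arbitrary illustration, not a recommendation):
         <read-write-backing-map-scheme>
              <internal-cache-scheme>
                   <local-scheme/>
              </internal-cache-scheme>
              <cachestore-scheme>
                   <class-scheme>
                        <class-name>spring-bean:partyCacheStore</class-name>
                   </class-scheme>
              </cachestore-scheme>
              <!-- a non-zero delay queues writes and flushes them asynchronously -->
              <write-delay>10s</write-delay>
         </read-write-backing-map-scheme>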

  • Loading data into the InfoCube in BI 7.0 from a flat file

    Hello  All,
    I need the complete procedure for loading data into the InfoCube from a flat file in BI 7.0, i.e. using transformations, DTPs, etc.
    Please help me with some good documents.

    Hi Pratighya,
    Step-by-step procedure for loading data from a flat file:
    1. Create the InfoObjects you might need in BI.
    2. Create your target InfoProvider.
    3. Create a source system.
    4. Create a DataSource.
    5. Create and configure an InfoPackage that will bring your records to the PSA.
    6. Create a transformation from the DataSource to the InfoProvider.
    7. Create a Data Transfer Process (DTP) from the DataSource to the InfoProvider.
    8. Schedule the InfoPackage.
    9. Once successful, run the DTP.
    10. This will fill your target.
    Hope this helps
    Regards
    Karthik
